Compare commits
49 Commits
dark-mode...random-stu
| Author | SHA1 | Date |
|---|---|---|
| | 9e859c199e | |
| | 7637b94ea7 | |
| | 721f1726eb | |
| | cfeb4dfa92 | |
| | 21ef67ed4c | |
| | 7707d451e0 | |
| | d82782adb4 | |
| | e10589abff | |
| | 01dcbfc896 | |
| | 50e0b34d30 | |
| | 44bb9fc58d | |
| | c0d3784f0c | |
| | e522026b71 | |
| | 46b13d85b7 | |
| | c12aa82a3e | |
| | b98bce8944 | |
| | f045c80dfd | |
| | 3b460dff2a | |
| | 5fa5732804 | |
| | 28e6e2b9df | |
| | 54d1df4442 | |
| | f69c2b5f23 | |
| | 51f0666f6a | |
| | b67d974f4c | |
| | 33fb2db981 | |
| | e391379c3e | |
| | 8d1609dd52 | |
| | f3380f302d | |
| | 3dba9c7ee1 | |
| | e0e4f7a9d6 | |
| | 48293dc579 | |
| | 38ac6243a0 | |
| | bd2f58e2a5 | |
| | 808e47c6b9 | |
| | 5945f0ed6b | |
| | 6bc7d76d15 | |
| | e9ed173e34 | |
| | 75d58d7021 | |
| | 896c8c5c57 | |
| | ec5547d0b0 | |
| | 77e4e3b8c3 | |
| | a1b03ddad1 | |
| | 6be32bea4c | |
| | 72c70e2a55 | |
| | 026532f2c2 | |
| | f88538336f | |
| | 3c7178115e | |
| | 292aaf090a | |
| | 6982339a1a | |
@@ -6,6 +6,10 @@ on:
  push:
    branches: [main]

defaults:
  run:
    working-directory: app

jobs:
  run-checks:
    runs-on: ubuntu-latest
74
README.md
@@ -1,52 +1,62 @@
<img src="https://github.com/openpipe/openpipe/assets/41524992/ca59596e-eb80-40f9-921f-6d67f6e6d8fa" width="72px" />
<!-- <img src="https://github.com/openpipe/openpipe/assets/41524992/ca59596e-eb80-40f9-921f-6d67f6e6d8fa" width="72px" /> -->

# OpenPipe

OpenPipe is a flexible playground for comparing and optimizing LLM prompts. It lets you quickly generate, test and compare candidate prompts with realistic sample data.
OpenPipe is a flexible playground for comparing and optimizing LLM prompts. It lets you quickly generate, test and compare candidate prompts, and can automatically [translate](#-translate-between-model-apis) those prompts between models.

<img src="https://github.com/openpipe/openpipe/assets/41524992/219a844e-3f4e-4f6b-8066-41348b42977b" alt="demo">

You can use our hosted version of OpenPipe at https://openpipe.ai. You can also clone this repository and [run it locally](#running-locally).

## Sample Experiments

These are simple experiments users have created that show how OpenPipe works.
These are simple experiments users have created that show how OpenPipe works. Feel free to fork them and start experimenting yourself.

- [Country Capitals](https://app.openpipe.ai/experiments/11111111-1111-1111-1111-111111111111)
- [Twitter Sentiment Analysis](https://app.openpipe.ai/experiments/62c20a73-2012-4a64-973c-4b665ad46a57)
- [Reddit User Needs](https://app.openpipe.ai/experiments/22222222-2222-2222-2222-222222222222)
- [OpenAI Function Calls](https://app.openpipe.ai/experiments/2ebbdcb3-ed51-456e-87dc-91f72eaf3e2b)
- [Activity Classification](https://app.openpipe.ai/experiments/3950940f-ab6b-4b74-841d-7e9dbc4e4ff8)

<img src="https://github.com/openpipe/openpipe/assets/176426/fc7624c6-5b65-4d4d-82b7-4a816f3e5678" alt="demo" height="400px">

You can use our hosted version of OpenPipe at https://openpipe.ai. You can also clone this repository and [run it locally](#running-locally).

## High-Level Features

**Configure Multiple Prompts**
Set up multiple prompt configurations and compare their output side-by-side. Each configuration can be adjusted independently.

**Visualize Responses**
Inspect prompt completions side-by-side.

**Test Many Inputs**
OpenPipe lets you _template_ a prompt. Use the templating feature to run the prompts you're testing against many potential inputs for broader coverage of your problem space than you'd get with manual testing.

**🪄 Auto-generate Test Scenarios**
OpenPipe includes a tool to generate new test scenarios based on your existing prompts and scenarios. Just click "Autogenerate Scenario" to try it out!

**Prompt Validation and Typeahead**
We use OpenAI's OpenAPI spec to automatically provide typeahead and validate prompts.

<img alt="typeahead" src="https://github.com/openpipe/openpipe/assets/176426/acc638f8-d851-4742-8d01-fe6f98890840" height="300px">

**Function Call Support**
Natively supports [OpenAI function calls](https://openai.com/blog/function-calling-and-other-api-updates) on supported models.

<img height="300px" alt="function calls" src="https://github.com/openpipe/openpipe/assets/176426/48ad13fe-af2f-4294-bf32-62015597fd9b">

## Supported Models

- All models available through the OpenAI [chat completion API](https://platform.openai.com/docs/guides/gpt/chat-completions-api)
- Llama2 [7b chat](https://replicate.com/a16z-infra/llama7b-v2-chat), [13b chat](https://replicate.com/a16z-infra/llama13b-v2-chat), [70b chat](https://replicate.com/replicate/llama70b-v2-chat)
- Anthropic's [Claude 1 Instant](https://www.anthropic.com/index/introducing-claude) and [Claude 2](https://www.anthropic.com/index/claude-2)

## Features

### 🔍 Visualize Responses

Inspect prompt completions side-by-side.

### 🧪 Bulk-Test

OpenPipe lets you _template_ a prompt. Use the templating feature to run the prompts you're testing against many potential inputs for broad coverage of your problem space.

### 📟 Translate between Model APIs

Write your prompt in one format and automatically convert it to work with any other model.

<img width="480" src="https://github.com/OpenPipe/OpenPipe/assets/41524992/1e19ccf2-96b6-4e93-a3a5-1449710d1b5b" alt="translate between models">

<br><br>

### 🛠️ Refine Your Prompts Automatically

Use a growing database of best-practice refinements to improve your prompts automatically.

<img width="480" src="https://github.com/OpenPipe/OpenPipe/assets/41524992/87a27fe7-daef-445c-a5e2-1c82b23f9f99" alt="add function call">

<br><br>

### 🪄 Auto-generate Test Scenarios

OpenPipe includes a tool to generate new test scenarios based on your existing prompts and scenarios. Just click "Autogenerate Scenario" to try it out!

<img width="600" src="https://github.com/openpipe/openpipe/assets/41524992/219a844e-3f4e-4f6b-8066-41348b42977b" alt="auto-generate">

<br><br>

## Running Locally

1. Install [Postgresql](https://www.postgresql.org/download/).
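As a rough illustration of the templating described above ("Test Many Inputs" / "Bulk-Test"), a prompt constructor interpolates scenario variables into the messages it builds. This sketch follows the `definePrompt` form used in the seed scripts later in this diff; the `scenario` accessor and the `country` variable are assumptions for illustration only, not code from the repository.

```ts
// Sketch only: a templated prompt constructor. The variable name ("country")
// and the `scenario` accessor are illustrative assumptions, not repo code.
definePrompt("openai/ChatCompletion", {
  model: "gpt-3.5-turbo-0613",
  messages: [
    {
      role: "user",
      content: `What is the capital of ${scenario.country}?`,
    },
  ],
});
```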
@@ -26,6 +26,8 @@ NEXT_PUBLIC_SOCKET_URL="http://localhost:3318"
NEXTAUTH_SECRET="your_secret"
NEXTAUTH_URL="http://localhost:3000"

NEXT_PUBLIC_HOST="http://localhost:3000"

# Next Auth Github Provider
GITHUB_CLIENT_ID="your_client_id"
GITHUB_CLIENT_SECRET="your_secret"
3
.gitignore → app/.gitignore
vendored
@@ -40,3 +40,6 @@ yarn-error.log*

# typescript
*.tsbuildinfo

# Sentry Auth Token
.sentryclirc
@@ -14,10 +14,14 @@ declare module "nextjs-routes" {
    | StaticRoute<"/account/signin">
    | DynamicRoute<"/api/auth/[...nextauth]", { "nextauth": string[] }>
    | StaticRoute<"/api/experiments/og-image">
    | StaticRoute<"/api/sentry-example-api">
    | DynamicRoute<"/api/trpc/[trpc]", { "trpc": string }>
    | DynamicRoute<"/data/[id]", { "id": string }>
    | StaticRoute<"/data">
    | DynamicRoute<"/experiments/[id]", { "id": string }>
    | StaticRoute<"/experiments">
    | StaticRoute<"/">
    | StaticRoute<"/sentry-example-page">
    | StaticRoute<"/world-champs">
    | StaticRoute<"/world-champs/signup">;
@@ -20,6 +20,9 @@ FROM base as builder
# Include all NEXT_PUBLIC_* env vars here
ARG NEXT_PUBLIC_POSTHOG_KEY
ARG NEXT_PUBLIC_SOCKET_URL
ARG NEXT_PUBLIC_HOST
ARG NEXT_PUBLIC_SENTRY_DSN
ARG SENTRY_AUTH_TOKEN

WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
61
app/next.config.mjs
Normal file
@@ -0,0 +1,61 @@
import nextRoutes from "nextjs-routes/config";
import { withSentryConfig } from "@sentry/nextjs";

/**
 * Run `build` or `dev` with `SKIP_ENV_VALIDATION` to skip env validation. This is especially useful
 * for Docker builds.
 */
const { env } = await import("./src/env.mjs");

/** @type {import("next").NextConfig} */
let config = {
  reactStrictMode: true,

  /**
   * If you have `experimental: { appDir: true }` set, then you must comment the below `i18n` config
   * out.
   *
   * @see https://github.com/vercel/next.js/issues/41980
   */
  i18n: {
    locales: ["en"],
    defaultLocale: "en",
  },

  rewrites: async () => [
    {
      source: "/ingest/:path*",
      destination: "https://app.posthog.com/:path*",
    },
  ],

  webpack: (config) => {
    config.module.rules.push({
      test: /\.txt$/,
      use: "raw-loader",
    });
    return config;
  },
};

config = nextRoutes()(config);

if (env.NEXT_PUBLIC_SENTRY_DSN && env.SENTRY_AUTH_TOKEN) {
  // @ts-expect-error - `withSentryConfig` is not typed correctly
  config = withSentryConfig(
    config,
    {
      authToken: env.SENTRY_AUTH_TOKEN,
      silent: true,
      org: "openpipe",
      project: "openpipe",
    },
    {
      widenClientFileUpload: true,
      tunnelRoute: "/monitoring",
      disableLogger: true,
    },
  );
}

export default config;
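The `/ingest` rewrite above is the usual reverse-proxy setup for PostHog. A minimal sketch of a client init pointed at it (where init actually happens in this app is not shown in this diff, so the file placement and env access here are assumptions):

```ts
import posthog from "posthog-js";

// Sketch: send posthog-js traffic through the Next.js /ingest rewrite instead
// of contacting app.posthog.com directly. The key name matches the
// NEXT_PUBLIC_POSTHOG_KEY build ARG declared in the Dockerfile above.
if (typeof window !== "undefined" && process.env.NEXT_PUBLIC_POSTHOG_KEY) {
  posthog.init(process.env.NEXT_PUBLIC_POSTHOG_KEY, {
    api_host: "/ingest",
  });
}
```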
@@ -18,15 +18,18 @@
    "start": "next start",
    "codegen": "tsx src/codegen/export-openai-types.ts",
    "seed": "tsx prisma/seed.ts",
    "check": "concurrently 'pnpm lint' 'pnpm tsc' 'pnpm prettier . --check'"
    "check": "concurrently 'pnpm lint' 'pnpm tsc' 'pnpm prettier . --check'",
    "test": "pnpm vitest --no-threads"
  },
  "dependencies": {
    "@anthropic-ai/sdk": "^0.5.8",
    "@apidevtools/json-schema-ref-parser": "^10.1.0",
    "@babel/preset-typescript": "^7.22.5",
    "@babel/standalone": "^7.22.9",
    "@chakra-ui/anatomy": "^2.2.0",
    "@chakra-ui/next-js": "^2.1.4",
    "@chakra-ui/react": "^2.7.1",
    "@chakra-ui/styled-system": "^2.9.1",
    "@emotion/react": "^11.11.1",
    "@emotion/server": "^11.11.0",
    "@emotion/styled": "^11.11.0",
@@ -34,6 +37,7 @@
    "@monaco-editor/loader": "^1.3.3",
    "@next-auth/prisma-adapter": "^1.0.5",
    "@prisma/client": "^4.14.0",
    "@sentry/nextjs": "^7.61.0",
    "@t3-oss/env-nextjs": "^0.3.1",
    "@tabler/icons-react": "^2.22.0",
    "@tanstack/react-query": "^4.29.7",
@@ -63,15 +67,18 @@
    "next-auth": "^4.22.1",
    "next-query-params": "^4.2.3",
    "nextjs-routes": "^2.0.1",
    "openai": "4.0.0-beta.2",
    "openai": "4.0.0-beta.7",
    "pluralize": "^8.0.0",
    "posthog-js": "^1.68.4",
    "posthog-js": "^1.75.3",
    "posthog-node": "^3.1.1",
    "prettier": "^3.0.0",
    "prismjs": "^1.29.0",
    "react": "18.2.0",
    "react-diff-viewer": "^3.1.1",
    "react-dom": "18.2.0",
    "react-github-btn": "^1.4.0",
    "react-icons": "^4.10.1",
    "react-json-tree": "^0.18.0",
    "react-select": "^5.7.4",
    "react-syntax-highlighter": "^15.5.0",
    "react-textarea-autosize": "^8.5.0",
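The new `"test": "pnpm vitest --no-threads"` script runs Vitest over the app package. A minimal sketch of a test file it would pick up (the file name and assertion are illustrative, not taken from this diff):

```ts
// app/src/example.test.ts — illustrative placeholder, not a file in this diff.
import { describe, expect, it } from "vitest";

describe("smoke test", () => {
  it("runs under `pnpm test`", () => {
    expect(1 + 1).toBe(2);
  });
});
```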
579
pnpm-lock.yaml → app/pnpm-lock.yaml
generated
@@ -0,0 +1,5 @@
-- CreateEnum
CREATE TYPE "UserRole" AS ENUM ('ADMIN', 'USER');

-- AlterTable
ALTER TABLE "User" ADD COLUMN "role" "UserRole" NOT NULL DEFAULT 'USER';
@@ -0,0 +1,28 @@
-- CreateTable
CREATE TABLE "Dataset" (
    "id" UUID NOT NULL,
    "name" TEXT NOT NULL,
    "organizationId" UUID NOT NULL,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" TIMESTAMP(3) NOT NULL,

    CONSTRAINT "Dataset_pkey" PRIMARY KEY ("id")
);

-- CreateTable
CREATE TABLE "DatasetEntry" (
    "id" UUID NOT NULL,
    "input" TEXT NOT NULL,
    "output" TEXT,
    "datasetId" UUID NOT NULL,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" TIMESTAMP(3) NOT NULL,

    CONSTRAINT "DatasetEntry_pkey" PRIMARY KEY ("id")
);

-- AddForeignKey
ALTER TABLE "Dataset" ADD CONSTRAINT "Dataset_organizationId_fkey" FOREIGN KEY ("organizationId") REFERENCES "Organization"("id") ON DELETE CASCADE ON UPDATE CASCADE;

-- AddForeignKey
ALTER TABLE "DatasetEntry" ADD CONSTRAINT "DatasetEntry_datasetId_fkey" FOREIGN KEY ("datasetId") REFERENCES "Dataset"("id") ON DELETE CASCADE ON UPDATE CASCADE;
@@ -0,0 +1,13 @@
/*
  Warnings:

  - You are about to drop the column `constructFn` on the `PromptVariant` table. All the data in the column will be lost.
  - You are about to drop the column `constructFnVersion` on the `PromptVariant` table. All the data in the column will be lost.
  - Added the required column `promptConstructor` to the `PromptVariant` table without a default value. This is not possible if the table is not empty.
  - Added the required column `promptConstructorVersion` to the `PromptVariant` table without a default value. This is not possible if the table is not empty.

*/
-- AlterTable

ALTER TABLE "PromptVariant" RENAME COLUMN "constructFn" TO "promptConstructor";
ALTER TABLE "PromptVariant" RENAME COLUMN "constructFnVersion" TO "promptConstructorVersion";
@@ -31,11 +31,11 @@ model Experiment {
model PromptVariant {
  id String @id @default(uuid()) @db.Uuid

  label              String
  constructFn        String
  constructFnVersion Int
  model              String
  modelProvider      String
  label                    String
  promptConstructor        String
  promptConstructorVersion Int
  model                    String
  modelProvider            String

  uiId    String  @default(uuid()) @db.Uuid
  visible Boolean @default(true)
@@ -174,6 +174,32 @@ model OutputEvaluation {
  @@unique([modelResponseId, evaluationId])
}

model Dataset {
  id String @id @default(uuid()) @db.Uuid

  name           String
  datasetEntries DatasetEntry[]

  organizationId String       @db.Uuid
  organization   Organization @relation(fields: [organizationId], references: [id], onDelete: Cascade)

  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt
}

model DatasetEntry {
  id String @id @default(uuid()) @db.Uuid

  input  String
  output String?

  datasetId String   @db.Uuid
  dataset   Dataset? @relation(fields: [datasetId], references: [id], onDelete: Cascade)

  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt
}

model Organization {
  id                String  @id @default(uuid()) @db.Uuid
  personalOrgUserId String? @unique @db.Uuid
@@ -183,6 +209,9 @@ model Organization {
  updatedAt DateTime @updatedAt
  organizationUsers OrganizationUser[]
  experiments       Experiment[]
  datasets          Dataset[]
  loggedCalls       LoggedCall[]
  apiKeys           ApiKey[]
}

enum OrganizationUserRole {
@@ -222,6 +251,99 @@ model WorldChampEntrant {
  @@unique([userId])
}

model LoggedCall {
  id String @id @default(uuid()) @db.Uuid

  startTime DateTime

  // True if this call was served from the cache, false otherwise
  cacheHit Boolean

  // A LoggedCall is always associated with a LoggedCallModelResponse. If this
  // is a cache miss, it's a new LoggedCallModelResponse we created for this.
  // If it's a cache hit, it's the existing LoggedCallModelResponse we served.
  modelResponseId String                  @db.Uuid
  modelResponse   LoggedCallModelResponse @relation(fields: [modelResponseId], references: [id], onDelete: Cascade)

  // The response created by this LoggedCall. Will be null if this LoggedCall is a cache hit.
  createdResponse LoggedCallModelResponse[] @relation(name: "ModelResponseCreatedBy")

  organizationId String        @db.Uuid
  organization   Organization? @relation(fields: [organizationId], references: [id], onDelete: Cascade)

  tags LoggedCallTag[]

  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt

  @@index([startTime])
}

model LoggedCallModelResponse {
  id String @id @default(uuid()) @db.Uuid

  reqPayload Json

  // The HTTP status returned by the model provider
  respStatus  Int?
  respPayload Json?

  // Should be null if the request was successful, and some string if the request failed.
  error String?

  startTime DateTime
  endTime   DateTime

  // Note: the function to calculate the cacheKey should include the project
  // ID so we don't share cached responses between projects, which could be an
  // attack vector. Also, we should only set the cacheKey on the model if the
  // request was successful.
  cacheKey String?

  // Derived fields
  durationMs   Int?
  inputTokens  Int?
  outputTokens Int?
  finishReason String?
  completionId String?
  totalCost    Decimal? @db.Decimal(18, 12)

  // The LoggedCall that created this LoggedCallModelResponse
  createdById String     @unique @db.Uuid
  createdBy   LoggedCall @relation(name: "ModelResponseCreatedBy", fields: [createdById], references: [id], onDelete: Cascade)

  createdAt   DateTime     @default(now())
  updatedAt   DateTime     @updatedAt
  loggedCalls LoggedCall[]

  @@index([cacheKey])
}

model LoggedCallTag {
  id    String  @id @default(cuid())
  name  String
  value String?

  loggedCallId String
  loggedCall   LoggedCall @relation(fields: [loggedCallId], references: [id], onDelete: Cascade)

  @@index([name])
  @@index([name, value])
}

model ApiKey {
  id String @id @default(uuid()) @db.Uuid

  name   String
  apiKey String @unique

  organizationId String        @db.Uuid
  organization   Organization? @relation(fields: [organizationId], references: [id], onDelete: Cascade)

  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt
}

model Account {
  id     String @id @default(uuid()) @db.Uuid
  userId String @db.Uuid
@@ -249,12 +371,20 @@ model Session {
  user User @relation(fields: [userId], references: [id], onDelete: Cascade)
}

enum UserRole {
  ADMIN
  USER
}

model User {
  id            String    @id @default(uuid()) @db.Uuid
  name          String?
  email         String?   @unique
  emailVerified DateTime?
  image         String?
  id            String    @id @default(uuid()) @db.Uuid
  name          String?
  email         String?   @unique
  emailVerified DateTime?
  image         String?

  role UserRole @default(USER)

  accounts          Account[]
  sessions          Session[]
  organizationUsers OrganizationUser[]
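The `cacheKey` comment in `LoggedCallModelResponse` above calls for a project-scoped key that is only stored when the upstream request succeeded. A minimal sketch of such a helper (the function name and the choice of hash are assumptions for illustration, not code from this diff):

```ts
import { createHash } from "crypto";

// Hash the organization (project) ID together with the request payload so two
// organizations issuing identical requests never share a cached response.
export function calculateCacheKey(organizationId: string, reqPayload: unknown): string {
  return createHash("sha256")
    .update(organizationId)
    .update(JSON.stringify(reqPayload))
    .digest("hex");
}

// Per the schema comment, only persist the key when the provider call succeeded,
// e.g. cacheKey: respStatus === 200 ? calculateCacheKey(orgId, reqPayload) : null
```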
@@ -1,6 +1,7 @@
import { prisma } from "~/server/db";
import dedent from "dedent";
import { generateNewCell } from "~/server/utils/generateNewCell";
import { promptConstructorVersion } from "~/promptConstructor/version";

const defaultId = "11111111-1111-1111-1111-111111111111";

@@ -51,8 +52,8 @@ await prisma.promptVariant.createMany({
      sortIndex: 0,
      model: "gpt-3.5-turbo-0613",
      modelProvider: "openai/ChatCompletion",
      constructFnVersion: 1,
      constructFn: dedent`
      promptConstructorVersion,
      promptConstructor: dedent`
        definePrompt("openai/ChatCompletion", {
          model: "gpt-3.5-turbo-0613",
          messages: [
@@ -70,8 +71,8 @@ await prisma.promptVariant.createMany({
      sortIndex: 1,
      model: "gpt-3.5-turbo-0613",
      modelProvider: "openai/ChatCompletion",
      constructFnVersion: 1,
      constructFn: dedent`
      promptConstructorVersion,
      promptConstructor: dedent`
        definePrompt("openai/ChatCompletion", {
          model: "gpt-3.5-turbo-0613",
          messages: [
@@ -3,6 +3,7 @@ import { generateNewCell } from "~/server/utils/generateNewCell";
import dedent from "dedent";
import { execSync } from "child_process";
import fs from "fs";
import { promptConstructorVersion } from "~/promptConstructor/version";

const defaultId = "11111111-1111-1111-1111-111111111112";

@@ -98,8 +99,8 @@ for (const dataset of datasets) {
      sortIndex: 0,
      model: "gpt-3.5-turbo-0613",
      modelProvider: "openai/ChatCompletion",
      constructFnVersion: 1,
      constructFn: dedent`
      promptConstructorVersion,
      promptConstructor: dedent`
        definePrompt("openai/ChatCompletion", {
          model: "gpt-3.5-turbo-0613",
          messages: [
@@ -2,6 +2,7 @@ import { prisma } from "~/server/db";
import dedent from "dedent";
import fs from "fs";
import { parse } from "csv-parse/sync";
import { promptConstructorVersion } from "~/promptConstructor/version";

const defaultId = "11111111-1111-1111-1111-111111111112";

@@ -85,8 +86,8 @@ await prisma.promptVariant.createMany({
      sortIndex: 0,
      model: "gpt-3.5-turbo-0613",
      modelProvider: "openai/ChatCompletion",
      constructFnVersion: 1,
      constructFn: dedent`
      promptConstructorVersion,
      promptConstructor: dedent`
        definePrompt("openai/ChatCompletion", {
          model: "gpt-3.5-turbo-0613",
          messages: [
(11 image files — sizes unchanged before/after: 15 KiB, 6.8 KiB, 22 KiB, 6.1 KiB, 704 B, 1.1 KiB, 15 KiB, 3.0 KiB, 858 B, 1.4 KiB, 62 KiB)
@@ -5,6 +5,9 @@ set -e
echo "Migrating the database"
pnpm prisma migrate deploy

echo "Migrating promptConstructors"
pnpm tsx src/promptConstructor/migrate.ts

echo "Starting the server"

pnpm concurrently --kill-others \
33
app/sentry.client.config.ts
Normal file
@@ -0,0 +1,33 @@
// This file configures the initialization of Sentry on the client.
// The config you add here will be used whenever a user loads a page in their browser.
// https://docs.sentry.io/platforms/javascript/guides/nextjs/

import * as Sentry from "@sentry/nextjs";
import { env } from "~/env.mjs";

if (env.NEXT_PUBLIC_SENTRY_DSN) {
  Sentry.init({
    dsn: env.NEXT_PUBLIC_SENTRY_DSN,

    // Adjust this value in production, or use tracesSampler for greater control
    tracesSampleRate: 1,

    // Setting this option to true will print useful information to the console while you're setting up Sentry.
    debug: false,

    replaysOnErrorSampleRate: 1.0,

    // This sets the sample rate to be 10%. You may want this to be 100% while
    // in development and sample at a lower rate in production
    replaysSessionSampleRate: 0.1,

    // You can remove this option if you're not planning to use the Sentry Session Replay feature:
    integrations: [
      new Sentry.Replay({
        // Additional Replay configuration goes in here, for example:
        maskAllText: true,
        blockAllMedia: true,
      }),
    ],
  });
}
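The comment above mentions `tracesSampler` as the finer-grained alternative to a fixed `tracesSampleRate`. A sketch of that option (the route name and rates are illustrative; the config in this diff keeps the fixed rate):

```ts
import * as Sentry from "@sentry/nextjs";
import { env } from "~/env.mjs";

// Sketch only: per-transaction sampling instead of the fixed tracesSampleRate
// used above. Drops health-check traces entirely and samples the rest at 20%.
Sentry.init({
  dsn: env.NEXT_PUBLIC_SENTRY_DSN,
  tracesSampler: (samplingContext) =>
    samplingContext.transactionContext.name === "/api/health" ? 0 : 0.2,
});
```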
19
app/sentry.edge.config.ts
Normal file
@@ -0,0 +1,19 @@
// This file configures the initialization of Sentry for edge features (middleware, edge routes, and so on).
// The config you add here will be used whenever one of the edge features is loaded.
// Note that this config is unrelated to the Vercel Edge Runtime and is also required when running locally.
// https://docs.sentry.io/platforms/javascript/guides/nextjs/

import * as Sentry from "@sentry/nextjs";
import { env } from "~/env.mjs";

if (env.NEXT_PUBLIC_SENTRY_DSN) {
  Sentry.init({
    dsn: env.NEXT_PUBLIC_SENTRY_DSN,

    // Adjust this value in production, or use tracesSampler for greater control
    tracesSampleRate: 1,

    // Setting this option to true will print useful information to the console while you're setting up Sentry.
    debug: false,
  });
}
18
app/sentry.server.config.ts
Normal file
@@ -0,0 +1,18 @@
// This file configures the initialization of Sentry on the server.
// The config you add here will be used whenever the server handles a request.
// https://docs.sentry.io/platforms/javascript/guides/nextjs/

import * as Sentry from "@sentry/nextjs";
import { env } from "~/env.mjs";

if (env.NEXT_PUBLIC_SENTRY_DSN) {
  Sentry.init({
    dsn: env.NEXT_PUBLIC_SENTRY_DSN,

    // Adjust this value in production, or use tracesSampler for greater control
    tracesSampleRate: 1,

    // Setting this option to true will print useful information to the console while you're setting up Sentry.
    debug: false,
  });
}
@@ -68,7 +68,7 @@ export const ChangeModelModal = ({
      return;
    await replaceVariantMutation.mutateAsync({
      id: variant.id,
      constructFn: modifiedPromptFn,
      promptConstructor: modifiedPromptFn,
      streamScenarios: visibleScenarios,
    });
    await utils.promptVariants.list.invalidate();
@@ -107,7 +107,7 @@ export const ChangeModelModal = ({
          <ModelSearch selectedModel={selectedModel} setSelectedModel={setSelectedModel} />
          {isString(modifiedPromptFn) && (
            <CompareFunctions
              originalFunction={variant.constructFn}
              originalFunction={variant.promptConstructor}
              newFunction={modifiedPromptFn}
              leftTitle={originalLabel}
              rightTitle={convertedLabel}
@@ -22,8 +22,8 @@ export const ModelSearch = (props: {
  const [containerRef, containerDimensions] = useElementDimensions();

  return (
    <VStack ref={containerRef as LegacyRef<HTMLDivElement>} w="full">
      <Text>Browse Models</Text>
    <VStack ref={containerRef as LegacyRef<HTMLDivElement>} w="full" fontFamily="inconsolata">
      <Text fontWeight="bold">Browse Models</Text>
      <Select<ProviderModel>
        styles={{ control: (provided) => ({ ...provided, width: containerDimensions?.width }) }}
        getOptionLabel={(data) => modelLabel(data.provider, data.model)}
@@ -23,16 +23,24 @@ export const ModelStatsCard = ({
|
||||
{label}
|
||||
</Text>
|
||||
|
||||
<VStack w="full" spacing={6} bgColor="gray.100" p={4} borderRadius={4}>
|
||||
<VStack
|
||||
w="full"
|
||||
spacing={6}
|
||||
borderWidth={1}
|
||||
borderColor="gray.300"
|
||||
p={4}
|
||||
borderRadius={8}
|
||||
fontFamily="inconsolata"
|
||||
>
|
||||
<HStack w="full" align="flex-start">
|
||||
<Text flex={1} fontSize="lg">
|
||||
<Text as="span" color="gray.600">
|
||||
{model.provider} /{" "}
|
||||
</Text>
|
||||
<VStack flex={1} fontSize="lg" alignItems="flex-start">
|
||||
<Text as="span" fontWeight="bold" color="gray.900">
|
||||
{model.name}
|
||||
</Text>
|
||||
</Text>
|
||||
<Text as="span" color="gray.600" fontSize="sm">
|
||||
Provider: {model.provider}
|
||||
</Text>
|
||||
</VStack>
|
||||
<Link
|
||||
href={model.learnMoreUrl}
|
||||
isExternal
|
||||
@@ -1,18 +1,29 @@
|
||||
import { Button, Spinner, InputGroup, InputRightElement, Icon, HStack } from "@chakra-ui/react";
|
||||
import {
|
||||
Button,
|
||||
Spinner,
|
||||
InputGroup,
|
||||
InputRightElement,
|
||||
Icon,
|
||||
HStack,
|
||||
type InputGroupProps,
|
||||
} from "@chakra-ui/react";
|
||||
import { IoMdSend } from "react-icons/io";
|
||||
import AutoResizeTextArea from "../AutoResizeTextArea";
|
||||
import AutoResizeTextArea from "./AutoResizeTextArea";
|
||||
|
||||
export const CustomInstructionsInput = ({
|
||||
instructions,
|
||||
setInstructions,
|
||||
loading,
|
||||
onSubmit,
|
||||
placeholder = "Send custom instructions",
|
||||
...props
|
||||
}: {
|
||||
instructions: string;
|
||||
setInstructions: (instructions: string) => void;
|
||||
loading: boolean;
|
||||
onSubmit: () => void;
|
||||
}) => {
|
||||
placeholder?: string;
|
||||
} & InputGroupProps) => {
|
||||
return (
|
||||
<InputGroup
|
||||
size="md"
|
||||
@@ -22,6 +33,7 @@ export const CustomInstructionsInput = ({
|
||||
borderRadius={8}
|
||||
alignItems="center"
|
||||
colorScheme="orange"
|
||||
{...props}
|
||||
>
|
||||
<AutoResizeTextArea
|
||||
value={instructions}
|
||||
@@ -33,7 +45,7 @@ export const CustomInstructionsInput = ({
|
||||
onSubmit();
|
||||
}
|
||||
}}
|
||||
placeholder="Send custom instructions"
|
||||
placeholder={placeholder}
|
||||
py={4}
|
||||
pl={4}
|
||||
pr={12}
|
||||
@@ -1,17 +1,17 @@
|
||||
import { api } from "~/utils/api";
|
||||
import { type PromptVariant, type Scenario } from "../types";
|
||||
import { Text, VStack } from "@chakra-ui/react";
|
||||
import { type StackProps, Text, VStack } from "@chakra-ui/react";
|
||||
import { useExperiment, useHandledAsyncCallback } from "~/utils/hooks";
|
||||
import SyntaxHighlighter from "react-syntax-highlighter";
|
||||
import { docco } from "react-syntax-highlighter/dist/cjs/styles/hljs";
|
||||
import stringify from "json-stringify-pretty-compact";
|
||||
import { type ReactElement, useState, useEffect, Fragment } from "react";
|
||||
import { type ReactElement, useState, useEffect, Fragment, useCallback } from "react";
|
||||
import useSocket from "~/utils/useSocket";
|
||||
import { OutputStats } from "./OutputStats";
|
||||
import { RetryCountdown } from "./RetryCountdown";
|
||||
import frontendModelProviders from "~/modelProviders/frontendModelProviders";
|
||||
import { ResponseLog } from "./ResponseLog";
|
||||
import { CellContent } from "./CellContent";
|
||||
import { CellOptions } from "./TopActions";
|
||||
|
||||
const WAITING_MESSAGE_INTERVAL = 20000;
|
||||
|
||||
@@ -72,37 +72,49 @@ export default function OutputCell({
|
||||
// TODO: disconnect from socket if we're not streaming anymore
|
||||
const streamedMessage = useSocket<OutputSchema>(cell?.id);
|
||||
|
||||
const mostRecentResponse = cell?.modelResponses[cell.modelResponses.length - 1];
|
||||
|
||||
const CellWrapper = useCallback(
|
||||
({ children, ...props }: StackProps) => (
|
||||
<VStack w="full" alignItems="flex-start" {...props} px={2} py={2} h="100%">
|
||||
{cell && (
|
||||
<CellOptions refetchingOutput={hardRefetching} refetchOutput={hardRefetch} cell={cell} />
|
||||
)}
|
||||
<VStack w="full" alignItems="flex-start" maxH={500} overflowY="auto" flex={1}>
|
||||
{children}
|
||||
</VStack>
|
||||
{mostRecentResponse && (
|
||||
<OutputStats modelResponse={mostRecentResponse} scenario={scenario} />
|
||||
)}
|
||||
</VStack>
|
||||
),
|
||||
[hardRefetching, hardRefetch, mostRecentResponse, scenario, cell],
|
||||
);
|
||||
|
||||
if (!vars) return null;
|
||||
|
||||
if (!cell && !fetchingOutput)
|
||||
return (
|
||||
<CellContent hardRefetching={hardRefetching} hardRefetch={hardRefetch}>
|
||||
<CellWrapper>
|
||||
<Text color="gray.500">Error retrieving output</Text>
|
||||
</CellContent>
|
||||
</CellWrapper>
|
||||
);
|
||||
|
||||
if (cell && cell.errorMessage) {
|
||||
return (
|
||||
<CellContent hardRefetching={hardRefetching} hardRefetch={hardRefetch}>
|
||||
<CellWrapper>
|
||||
<Text color="red.500">{cell.errorMessage}</Text>
|
||||
</CellContent>
|
||||
</CellWrapper>
|
||||
);
|
||||
}
|
||||
|
||||
if (disabledReason) return <Text color="gray.500">{disabledReason}</Text>;
|
||||
|
||||
const mostRecentResponse = cell?.modelResponses[cell.modelResponses.length - 1];
|
||||
const showLogs = !streamedMessage && !mostRecentResponse?.output;
|
||||
|
||||
if (showLogs)
|
||||
return (
|
||||
<CellContent
|
||||
hardRefetching={hardRefetching}
|
||||
hardRefetch={hardRefetch}
|
||||
alignItems="flex-start"
|
||||
fontFamily="inconsolata, monospace"
|
||||
spacing={0}
|
||||
>
|
||||
<CellWrapper alignItems="flex-start" fontFamily="inconsolata, monospace" spacing={0}>
|
||||
{cell?.jobQueuedAt && <ResponseLog time={cell.jobQueuedAt} title="Job queued" />}
|
||||
{cell?.jobStartedAt && <ResponseLog time={cell.jobStartedAt} title="Job started" />}
|
||||
{cell?.modelResponses?.map((response) => {
|
||||
@@ -124,9 +136,11 @@ export default function OutputCell({
|
||||
Array.from({ length: numWaitingMessages }, (_, i) => (
|
||||
<ResponseLog
|
||||
key={`waiting-${i}`}
|
||||
// eslint-disable-next-line @typescript-eslint/no-non-null-assertion
|
||||
time={
|
||||
new Date(response.requestedAt!.getTime() + (i + 1) * WAITING_MESSAGE_INTERVAL)
|
||||
new Date(
|
||||
(response.requestedAt?.getTime?.() ?? 0) +
|
||||
(i + 1) * WAITING_MESSAGE_INTERVAL,
|
||||
)
|
||||
}
|
||||
title="Waiting for response..."
|
||||
/>
|
||||
@@ -146,7 +160,7 @@ export default function OutputCell({
|
||||
{mostRecentResponse?.retryTime && (
|
||||
<RetryCountdown retryTime={mostRecentResponse.retryTime} />
|
||||
)}
|
||||
</CellContent>
|
||||
</CellWrapper>
|
||||
);
|
||||
|
||||
const normalizedOutput = mostRecentResponse?.output
|
||||
@@ -157,50 +171,27 @@ export default function OutputCell({
|
||||
|
||||
if (mostRecentResponse?.output && normalizedOutput?.type === "json") {
|
||||
return (
|
||||
<VStack
|
||||
w="100%"
|
||||
h="100%"
|
||||
fontSize="xs"
|
||||
flexWrap="wrap"
|
||||
overflowX="hidden"
|
||||
justifyContent="space-between"
|
||||
>
|
||||
<CellContent
|
||||
hardRefetching={hardRefetching}
|
||||
hardRefetch={hardRefetch}
|
||||
w="full"
|
||||
flex={1}
|
||||
spacing={0}
|
||||
<CellWrapper>
|
||||
<SyntaxHighlighter
|
||||
customStyle={{ overflowX: "unset", width: "100%", flex: 1 }}
|
||||
language="json"
|
||||
style={docco}
|
||||
lineProps={{
|
||||
style: { wordBreak: "break-all", whiteSpace: "pre-wrap" },
|
||||
}}
|
||||
wrapLines
|
||||
>
|
||||
<SyntaxHighlighter
|
||||
customStyle={{ overflowX: "unset", width: "100%", flex: 1 }}
|
||||
language="json"
|
||||
style={docco}
|
||||
lineProps={{
|
||||
style: { wordBreak: "break-all", whiteSpace: "pre-wrap" },
|
||||
}}
|
||||
wrapLines
|
||||
>
|
||||
{stringify(normalizedOutput.value, { maxLength: 40 })}
|
||||
</SyntaxHighlighter>
|
||||
</CellContent>
|
||||
<OutputStats modelResponse={mostRecentResponse} scenario={scenario} />
|
||||
</VStack>
|
||||
{stringify(normalizedOutput.value, { maxLength: 40 })}
|
||||
</SyntaxHighlighter>
|
||||
</CellWrapper>
|
||||
);
|
||||
}
|
||||
|
||||
const contentToDisplay = (normalizedOutput?.type === "text" && normalizedOutput.value) || "";
|
||||
|
||||
return (
|
||||
<VStack w="100%" h="100%" justifyContent="space-between" whiteSpace="pre-wrap">
|
||||
<VStack w="full" alignItems="flex-start" spacing={0}>
|
||||
<CellContent hardRefetching={hardRefetching} hardRefetch={hardRefetch}>
|
||||
<Text>{contentToDisplay}</Text>
|
||||
</CellContent>
|
||||
</VStack>
|
||||
{mostRecentResponse?.output && (
|
||||
<OutputStats modelResponse={mostRecentResponse} scenario={scenario} />
|
||||
)}
|
||||
</VStack>
|
||||
<CellWrapper>
|
||||
<Text>{contentToDisplay}</Text>
|
||||
</CellWrapper>
|
||||
);
|
||||
}
|
||||
@@ -23,8 +23,15 @@ export const OutputStats = ({
  const completionTokens = modelResponse.completionTokens;

  return (
    <HStack w="full" align="center" color="gray.500" fontSize="2xs" mt={{ base: 0, md: 1 }}>
      <HStack flex={1}>
    <HStack
      w="full"
      align="center"
      color="gray.500"
      fontSize="2xs"
      mt={{ base: 0, md: 1 }}
      alignItems="flex-end"
    >
      <HStack flex={1} flexWrap="wrap">
        {modelResponse.outputEvaluations.map((evaluation) => {
          const passed = evaluation.result > 0.5;
          return (
36
app/src/components/OutputsTable/OutputCell/PromptModal.tsx
Normal file
@@ -0,0 +1,36 @@
import {
  Modal,
  ModalBody,
  ModalCloseButton,
  ModalContent,
  ModalHeader,
  ModalOverlay,
  type UseDisclosureReturn,
} from "@chakra-ui/react";
import { type RouterOutputs } from "~/utils/api";
import { JSONTree } from "react-json-tree";

export default function ExpandedModal(props: {
  cell: NonNullable<RouterOutputs["scenarioVariantCells"]["get"]>;
  disclosure: UseDisclosureReturn;
}) {
  return (
    <Modal isOpen={props.disclosure.isOpen} onClose={props.disclosure.onClose} size="2xl">
      <ModalOverlay />
      <ModalContent>
        <ModalHeader>Prompt</ModalHeader>
        <ModalCloseButton />
        <ModalBody>
          <JSONTree
            data={props.cell.prompt}
            invertTheme={true}
            theme="chalk"
            shouldExpandNodeInitially={() => true}
            getItemString={() => ""}
            hideRoot
          />
        </ModalBody>
      </ModalContent>
    </Modal>
  );
}
53
app/src/components/OutputsTable/OutputCell/TopActions.tsx
Normal file
@@ -0,0 +1,53 @@
import { HStack, Icon, IconButton, Spinner, Tooltip, useDisclosure } from "@chakra-ui/react";
import { BsArrowClockwise, BsInfoCircle } from "react-icons/bs";
import { useExperimentAccess } from "~/utils/hooks";
import ExpandedModal from "./PromptModal";
import { type RouterOutputs } from "~/utils/api";

export const CellOptions = ({
  cell,
  refetchingOutput,
  refetchOutput,
}: {
  cell: RouterOutputs["scenarioVariantCells"]["get"];
  refetchingOutput: boolean;
  refetchOutput: () => void;
}) => {
  const { canModify } = useExperimentAccess();

  const modalDisclosure = useDisclosure();

  return (
    <HStack justifyContent="flex-end" w="full">
      {cell && (
        <>
          <Tooltip label="See Prompt">
            <IconButton
              aria-label="See Prompt"
              icon={<Icon as={BsInfoCircle} boxSize={4} />}
              onClick={modalDisclosure.onOpen}
              size="xs"
              colorScheme="gray"
              color="gray.500"
              variant="ghost"
            />
          </Tooltip>
          <ExpandedModal cell={cell} disclosure={modalDisclosure} />
        </>
      )}
      {canModify && (
        <Tooltip label="Refetch output">
          <IconButton
            size="xs"
            color="gray.500"
            variant="ghost"
            cursor="pointer"
            onClick={refetchOutput}
            aria-label="refetch output"
            icon={<Icon as={refetchingOutput ? Spinner : BsArrowClockwise} boxSize={4} />}
          />
        </Tooltip>
      )}
    </HStack>
  );
};
@@ -1,9 +1,8 @@
|
||||
import { useEffect, type DragEvent } from "react";
|
||||
import { api } from "~/utils/api";
|
||||
import { isEqual } from "lodash-es";
|
||||
import { type Scenario } from "./types";
|
||||
import { useEffect, useState, type DragEvent } from "react";
|
||||
import { api } from "~/utils/api";
|
||||
import { useExperiment, useExperimentAccess, useHandledAsyncCallback } from "~/utils/hooks";
|
||||
import { useState } from "react";
|
||||
import { type Scenario } from "./types";
|
||||
|
||||
import {
|
||||
Box,
|
||||
@@ -12,14 +11,12 @@ import {
|
||||
Icon,
|
||||
IconButton,
|
||||
Spinner,
|
||||
Stack,
|
||||
Text,
|
||||
Tooltip,
|
||||
VStack,
|
||||
Text,
|
||||
} from "@chakra-ui/react";
|
||||
import { cellPadding } from "../constants";
|
||||
import { BsArrowsAngleExpand, BsX } from "react-icons/bs";
|
||||
import { RiDraggable } from "react-icons/ri";
|
||||
import { cellPadding } from "../constants";
|
||||
import { FloatingLabelInput } from "./FloatingLabelInput";
|
||||
import { ScenarioEditorModal } from "./ScenarioEditorModal";
|
||||
|
||||
@@ -115,60 +112,44 @@ export default function ScenarioEditor({
|
||||
onDrop={onReorder}
|
||||
backgroundColor={isDragTarget ? "gray.100" : "transparent"}
|
||||
>
|
||||
{canModify && props.canHide && (
|
||||
<Stack
|
||||
alignSelf="flex-start"
|
||||
opacity={props.hovered ? 1 : 0}
|
||||
spacing={0}
|
||||
ml={-cellPadding.x}
|
||||
>
|
||||
<Tooltip label="Hide scenario" hasArrow>
|
||||
{/* for some reason the tooltip can't position itself properly relative to the icon without the wrapping box */}
|
||||
<Button
|
||||
variant="unstyled"
|
||||
color="gray.400"
|
||||
height="unset"
|
||||
width="unset"
|
||||
minW="unset"
|
||||
onClick={onHide}
|
||||
_hover={{
|
||||
color: "gray.800",
|
||||
cursor: "pointer",
|
||||
}}
|
||||
>
|
||||
<Icon as={hidingInProgress ? Spinner : BsX} boxSize={hidingInProgress ? 4 : 6} />
|
||||
</Button>
|
||||
</Tooltip>
|
||||
<Icon
|
||||
as={RiDraggable}
|
||||
boxSize={6}
|
||||
color="gray.400"
|
||||
_hover={{ color: "gray.800", cursor: "pointer" }}
|
||||
/>
|
||||
</Stack>
|
||||
)}
|
||||
|
||||
{variableLabels.length === 0 ? (
|
||||
<Box color="gray.500">
|
||||
{vars.data ? "No scenario variables configured" : "Loading..."}
|
||||
</Box>
|
||||
) : (
|
||||
<VStack spacing={4} flex={1} py={2}>
|
||||
<HStack justifyContent="space-between" w="100%">
|
||||
<Text color="gray.500">Scenario</Text>
|
||||
<IconButton
|
||||
className="fullscreen-toggle"
|
||||
aria-label="Maximize"
|
||||
icon={<BsArrowsAngleExpand />}
|
||||
onClick={() => setScenarioEditorModalOpen(true)}
|
||||
boxSize={6}
|
||||
borderRadius={4}
|
||||
p={1.5}
|
||||
minW={0}
|
||||
colorScheme="gray"
|
||||
color="gray.500"
|
||||
variant="ghost"
|
||||
/>
|
||||
<HStack justifyContent="space-between" w="100%" align="center" spacing={0}>
|
||||
<Text flex={1}>Scenario</Text>
|
||||
<Tooltip label="Expand" hasArrow>
|
||||
<IconButton
|
||||
aria-label="Expand"
|
||||
icon={<Icon as={BsArrowsAngleExpand} boxSize={3} />}
|
||||
onClick={() => setScenarioEditorModalOpen(true)}
|
||||
size="xs"
|
||||
colorScheme="gray"
|
||||
color="gray.500"
|
||||
variant="ghost"
|
||||
/>
|
||||
</Tooltip>
|
||||
{canModify && props.canHide && (
|
||||
<Tooltip label="Delete" hasArrow>
|
||||
<IconButton
|
||||
aria-label="Delete"
|
||||
icon={
|
||||
<Icon
|
||||
as={hidingInProgress ? Spinner : BsX}
|
||||
boxSize={hidingInProgress ? 4 : 6}
|
||||
/>
|
||||
}
|
||||
onClick={onHide}
|
||||
size="xs"
|
||||
display="flex"
|
||||
colorScheme="gray"
|
||||
color="gray.500"
|
||||
variant="ghost"
|
||||
/>
|
||||
</Tooltip>
|
||||
)}
|
||||
</HStack>
|
||||
{variableLabels.map((key) => {
|
||||
const value = values[key] ?? "";
|
||||
@@ -1,7 +1,6 @@
|
||||
import {
|
||||
Button,
|
||||
HStack,
|
||||
Icon,
|
||||
Modal,
|
||||
ModalBody,
|
||||
ModalCloseButton,
|
||||
@@ -14,7 +13,6 @@ import {
|
||||
VStack,
|
||||
} from "@chakra-ui/react";
|
||||
import { useEffect, useState } from "react";
|
||||
import { BsFileTextFill } from "react-icons/bs";
|
||||
import { isEqual } from "lodash-es";
|
||||
|
||||
import { api } from "~/utils/api";
|
||||
@@ -60,8 +58,6 @@ export const ScenarioEditorModal = ({
|
||||
await utils.scenarios.list.invalidate();
|
||||
}, [mutation, values]);
|
||||
|
||||
console.log("scenario", scenario);
|
||||
|
||||
const vars = api.templateVars.list.useQuery({ experimentId: experiment.data?.id ?? "" });
|
||||
const variableLabels = vars.data?.map((v) => v.label) ?? [];
|
||||
|
||||
@@ -73,12 +69,7 @@ export const ScenarioEditorModal = ({
|
||||
>
|
||||
<ModalOverlay />
|
||||
<ModalContent w={1200}>
|
||||
<ModalHeader>
|
||||
<HStack>
|
||||
<Icon as={BsFileTextFill} />
|
||||
<Text>Scenario</Text>
|
||||
</HStack>
|
||||
</ModalHeader>
|
||||
<ModalHeader />
|
||||
<ModalCloseButton />
|
||||
<ModalBody maxW="unset">
|
||||
<VStack spacing={8}>
|
||||
21
app/src/components/OutputsTable/ScenarioPaginator.tsx
Normal file
@@ -0,0 +1,21 @@
import { useScenarios } from "~/utils/hooks";
import Paginator from "../Paginator";

const ScenarioPaginator = () => {
  const { data } = useScenarios();

  if (!data) return null;

  const { scenarios, startIndex, lastPage, count } = data;

  return (
    <Paginator
      numItemsLoaded={scenarios.length}
      startIndex={startIndex}
      lastPage={lastPage}
      count={count}
    />
  );
};

export default ScenarioPaginator;
@@ -1,6 +1,5 @@
import { Box, GridItem } from "@chakra-ui/react";
import { GridItem } from "@chakra-ui/react";
import React, { useState } from "react";
import { cellPadding } from "../constants";
import OutputCell from "./OutputCell/OutputCell";
import ScenarioEditor from "./ScenarioEditor";
import type { PromptVariant, Scenario } from "./types";
@@ -39,9 +38,7 @@ const ScenarioRow = (props: {
          colStart={i + 2}
          {...borders}
        >
          <Box h="100%" w="100%" px={cellPadding.x} py={cellPadding.y}>
            <OutputCell key={variant.id} scenario={props.scenario} variant={variant} />
          </Box>
          <OutputCell key={variant.id} scenario={props.scenario} variant={variant} />
        </GridItem>
      ))}
    </>
@@ -47,7 +47,7 @@ export default function VariantEditor(props: { variant: PromptVariant }) {
    return () => window.removeEventListener("keydown", handleEsc);
  }, [isFullscreen, toggleFullscreen]);

  const lastSavedFn = props.variant.constructFn;
  const lastSavedFn = props.variant.promptConstructor;

  const modifierKey = useModifierKeyLabel();

@@ -96,7 +96,7 @@ export default function VariantEditor(props: { variant: PromptVariant }) {

    const resp = await replaceVariant.mutateAsync({
      id: props.variant.id,
      constructFn: currentFn,
      promptConstructor: currentFn,
      streamScenarios: visibleScenarios,
    });
    if (resp.status === "error") {
@@ -43,12 +43,12 @@ export default function VariantStats(props: { variant: PromptVariant }) {
  return (
    <HStack
      justifyContent="space-between"
      alignItems="center"
      alignItems="flex-end"
      mx="2"
      fontSize="xs"
      py={cellPadding.y}
    >
      <HStack px={cellPadding.x}>
      <HStack px={cellPadding.x} flexWrap="wrap">
        {showNumFinished && (
          <Text>
            {data.outputCount} / {data.scenarioCount}
@@ -35,7 +35,7 @@ export default function OutputsTable({ experimentId }: { experimentId: string |
      pb={24}
      pl={8}
      display="grid"
      gridTemplateColumns={`250px repeat(${variants.data.length}, minmax(300px, 1fr)) auto`}
      gridTemplateColumns={`250px repeat(${variants.data.length}, minmax(320px, 1fr)) auto`}
      sx={{
        "> *": {
          borderColor: "gray.300",
@@ -5,15 +5,20 @@ import {
  BsChevronLeft,
  BsChevronRight,
} from "react-icons/bs";
import { usePage, useScenarios } from "~/utils/hooks";
import { usePage } from "~/utils/hooks";

const ScenarioPaginator = () => {
const Paginator = ({
  numItemsLoaded,
  startIndex,
  lastPage,
  count,
}: {
  numItemsLoaded: number;
  startIndex: number;
  lastPage: number;
  count: number;
}) => {
  const [page, setPage] = usePage();
  const { data } = useScenarios();

  if (!data) return null;

  const { scenarios, startIndex, lastPage, count } = data;

  const nextPage = () => {
    if (page < lastPage) {
@@ -49,7 +54,7 @@ const ScenarioPaginator = () => {
        icon={<BsChevronLeft />}
      />
      <Box>
        {startIndex}-{startIndex + scenarios.length - 1} / {count}
        {startIndex}-{startIndex + numItemsLoaded - 1} / {count}
      </Box>
      <IconButton
        variant="ghost"
@@ -71,4 +76,4 @@ const ScenarioPaginator = () => {
  );
};

export default ScenarioPaginator;
export default Paginator;
@@ -20,7 +20,7 @@ import { useHandledAsyncCallback, useVisibleScenarioIds } from "~/utils/hooks";
|
||||
import { type PromptVariant } from "@prisma/client";
|
||||
import { useState } from "react";
|
||||
import CompareFunctions from "./CompareFunctions";
|
||||
import { CustomInstructionsInput } from "./CustomInstructionsInput";
|
||||
import { CustomInstructionsInput } from "../CustomInstructionsInput";
|
||||
import { RefineAction } from "./RefineAction";
|
||||
import { isObject, isString } from "lodash-es";
|
||||
import { type RefinementAction, type SupportedProvider } from "~/modelProviders/types";
|
||||
@@ -73,7 +73,7 @@ export const RefinePromptModal = ({
|
||||
return;
|
||||
await replaceVariantMutation.mutateAsync({
|
||||
id: variant.id,
|
||||
constructFn: refinedPromptFn,
|
||||
promptConstructor: refinedPromptFn,
|
||||
streamScenarios: visibleScenarios,
|
||||
});
|
||||
await utils.promptVariants.list.invalidate();
|
||||
@@ -97,7 +97,7 @@ export const RefinePromptModal = ({
|
||||
<ModalCloseButton />
|
||||
<ModalBody maxW="unset">
|
||||
<VStack spacing={8}>
|
||||
<VStack spacing={4}>
|
||||
<VStack spacing={4} w="full">
|
||||
{Object.keys(refinementActions).length && (
|
||||
<>
|
||||
<SimpleGrid columns={{ base: 1, md: 2 }} spacing={8}>
|
||||
@@ -122,11 +122,11 @@ export const RefinePromptModal = ({
|
||||
instructions={instructions}
|
||||
setInstructions={setInstructions}
|
||||
loading={modificationInProgress}
|
||||
onSubmit={getModifiedPromptFn}
|
||||
onSubmit={() => getModifiedPromptFn()}
|
||||
/>
|
||||
</VStack>
|
||||
<CompareFunctions
|
||||
originalFunction={variant.constructFn}
|
||||
originalFunction={variant.promptConstructor}
|
||||
newFunction={isString(refinedPromptFn) ? refinedPromptFn : undefined}
|
||||
maxH="40vh"
|
||||
/>
|
||||
110
app/src/components/datasets/DatasetCard.tsx
Normal file
@@ -0,0 +1,110 @@
|
||||
import {
|
||||
HStack,
|
||||
Icon,
|
||||
VStack,
|
||||
Text,
|
||||
Divider,
|
||||
Spinner,
|
||||
AspectRatio,
|
||||
SkeletonText,
|
||||
} from "@chakra-ui/react";
|
||||
import { RiDatabase2Line } from "react-icons/ri";
|
||||
import { formatTimePast } from "~/utils/dayjs";
|
||||
import Link from "next/link";
|
||||
import { useRouter } from "next/router";
|
||||
import { BsPlusSquare } from "react-icons/bs";
|
||||
import { api } from "~/utils/api";
|
||||
import { useHandledAsyncCallback } from "~/utils/hooks";
|
||||
|
||||
type DatasetData = {
|
||||
name: string;
|
||||
numEntries: number;
|
||||
id: string;
|
||||
createdAt: Date;
|
||||
updatedAt: Date;
|
||||
};
|
||||
|
||||
export const DatasetCard = ({ dataset }: { dataset: DatasetData }) => {
|
||||
return (
|
||||
<AspectRatio ratio={1.2} w="full">
|
||||
<VStack
|
||||
as={Link}
|
||||
href={{ pathname: "/data/[id]", query: { id: dataset.id } }}
|
||||
bg="gray.50"
|
||||
_hover={{ bg: "gray.100" }}
|
||||
transition="background 0.2s"
|
||||
cursor="pointer"
|
||||
borderColor="gray.200"
|
||||
borderWidth={1}
|
||||
p={4}
|
||||
justify="space-between"
|
||||
>
|
||||
<HStack w="full" color="gray.700" justify="center">
|
||||
<Icon as={RiDatabase2Line} boxSize={4} />
|
||||
<Text fontWeight="bold">{dataset.name}</Text>
|
||||
</HStack>
|
||||
<HStack h="full" spacing={4} flex={1} align="center">
|
||||
<CountLabel label="Rows" count={dataset.numEntries} />
|
||||
</HStack>
|
||||
<HStack w="full" color="gray.500" fontSize="xs" textAlign="center">
|
||||
<Text flex={1}>Created {formatTimePast(dataset.createdAt)}</Text>
|
||||
<Divider h={4} orientation="vertical" />
|
||||
<Text flex={1}>Updated {formatTimePast(dataset.updatedAt)}</Text>
|
||||
</HStack>
|
||||
</VStack>
|
||||
</AspectRatio>
|
||||
);
|
||||
};
|
||||
|
||||
const CountLabel = ({ label, count }: { label: string; count: number }) => {
|
||||
return (
|
||||
<VStack alignItems="center" flex={1}>
|
||||
<Text color="gray.500" fontWeight="bold">
|
||||
{label}
|
||||
</Text>
|
||||
<Text fontSize="sm" color="gray.500">
|
||||
{count}
|
||||
</Text>
|
||||
</VStack>
|
||||
);
|
||||
};
|
||||
|
||||
export const NewDatasetCard = () => {
|
||||
const router = useRouter();
|
||||
const createMutation = api.datasets.create.useMutation();
|
||||
const [createDataset, isLoading] = useHandledAsyncCallback(async () => {
|
||||
const newDataset = await createMutation.mutateAsync({ label: "New Dataset" });
|
||||
await router.push({ pathname: "/data/[id]", query: { id: newDataset.id } });
|
||||
}, [createMutation, router]);
|
||||
|
||||
return (
|
||||
<AspectRatio ratio={1.2} w="full">
|
||||
<VStack
|
||||
align="center"
|
||||
justify="center"
|
||||
_hover={{ cursor: "pointer", bg: "gray.50" }}
|
||||
transition="background 0.2s"
|
||||
cursor="pointer"
|
||||
borderColor="gray.200"
|
||||
borderWidth={1}
|
||||
p={4}
|
||||
onClick={createDataset}
|
||||
>
|
||||
<Icon as={isLoading ? Spinner : BsPlusSquare} boxSize={8} />
|
||||
<Text display={{ base: "none", md: "block" }} ml={2}>
|
||||
New Dataset
|
||||
</Text>
|
||||
</VStack>
|
||||
</AspectRatio>
|
||||
);
|
||||
};
|
||||
|
||||
export const DatasetCardSkeleton = () => (
|
||||
<AspectRatio ratio={1.2} w="full">
|
||||
<VStack align="center" borderColor="gray.200" borderWidth={1} p={4} bg="gray.50">
|
||||
<SkeletonText noOfLines={1} w="80%" />
|
||||
<SkeletonText noOfLines={2} w="60%" />
|
||||
<SkeletonText noOfLines={1} w="80%" />
|
||||
</VStack>
|
||||
</AspectRatio>
|
||||
);
|
||||
21
app/src/components/datasets/DatasetEntriesPaginator.tsx
Normal file
@@ -0,0 +1,21 @@
import { useDatasetEntries } from "~/utils/hooks";
import Paginator from "../Paginator";

const DatasetEntriesPaginator = () => {
  const { data } = useDatasetEntries();

  if (!data) return null;

  const { entries, startIndex, lastPage, count } = data;

  return (
    <Paginator
      numItemsLoaded={entries.length}
      startIndex={startIndex}
      lastPage={lastPage}
      count={count}
    />
  );
};

export default DatasetEntriesPaginator;
31
app/src/components/datasets/DatasetEntriesTable.tsx
Normal file
@@ -0,0 +1,31 @@
import { type StackProps, VStack, Table, Th, Tr, Thead, Tbody, Text } from "@chakra-ui/react";
import { useDatasetEntries } from "~/utils/hooks";
import TableRow from "./TableRow";
import DatasetEntriesPaginator from "./DatasetEntriesPaginator";

const DatasetEntriesTable = (props: StackProps) => {
  const { data } = useDatasetEntries();

  return (
    <VStack justifyContent="space-between" {...props}>
      <Table variant="simple" sx={{ "table-layout": "fixed", width: "full" }}>
        <Thead>
          <Tr>
            <Th>Input</Th>
            <Th>Output</Th>
          </Tr>
        </Thead>
        <Tbody>{data?.entries.map((entry) => <TableRow key={entry.id} entry={entry} />)}</Tbody>
      </Table>
      {!data || data.entries.length === 0 ? (
        <Text alignSelf="flex-start" pl={6} color="gray.500">
          No entries found
        </Text>
      ) : (
        <DatasetEntriesPaginator />
      )}
    </VStack>
  );
};

export default DatasetEntriesTable;
||||
@@ -0,0 +1,26 @@
import { Button, HStack, useDisclosure } from "@chakra-ui/react";
import { BiImport } from "react-icons/bi";
import { BsStars } from "react-icons/bs";

import { GenerateDataModal } from "./GenerateDataModal";

export const DatasetHeaderButtons = () => {
  const generateModalDisclosure = useDisclosure();

  return (
    <>
      <HStack>
        <Button leftIcon={<BiImport />} colorScheme="blue" variant="ghost">
          Import Data
        </Button>
        <Button leftIcon={<BsStars />} colorScheme="blue" onClick={generateModalDisclosure.onOpen}>
          Generate Data
        </Button>
      </HStack>
      <GenerateDataModal
        isOpen={generateModalDisclosure.isOpen}
        onClose={generateModalDisclosure.onClose}
      />
    </>
  );
};
@@ -0,0 +1,128 @@
|
||||
import {
|
||||
Modal,
|
||||
ModalBody,
|
||||
ModalCloseButton,
|
||||
ModalContent,
|
||||
ModalHeader,
|
||||
ModalOverlay,
|
||||
ModalFooter,
|
||||
Text,
|
||||
HStack,
|
||||
VStack,
|
||||
Icon,
|
||||
NumberInput,
|
||||
NumberInputField,
|
||||
NumberInputStepper,
|
||||
NumberIncrementStepper,
|
||||
NumberDecrementStepper,
|
||||
Button,
|
||||
} from "@chakra-ui/react";
|
||||
import { BsStars } from "react-icons/bs";
|
||||
import { useState } from "react";
|
||||
import { useDataset, useHandledAsyncCallback } from "~/utils/hooks";
|
||||
import { api } from "~/utils/api";
|
||||
import AutoResizeTextArea from "~/components/AutoResizeTextArea";
|
||||
|
||||
export const GenerateDataModal = ({
|
||||
isOpen,
|
||||
onClose,
|
||||
}: {
|
||||
isOpen: boolean;
|
||||
onClose: () => void;
|
||||
}) => {
|
||||
const utils = api.useContext();
|
||||
|
||||
const datasetId = useDataset().data?.id;
|
||||
|
||||
const [numToGenerate, setNumToGenerate] = useState<number>(20);
|
||||
const [inputDescription, setInputDescription] = useState<string>(
|
||||
"Each input should contain an email body. Half of the emails should contain event details, and the other half should not.",
|
||||
);
|
||||
const [outputDescription, setOutputDescription] = useState<string>(
|
||||
`Each output should contain "true" or "false", where "true" indicates that the email contains event details.`,
|
||||
);
|
||||
|
||||
const generateEntriesMutation = api.datasetEntries.autogenerateEntries.useMutation();
|
||||
|
||||
const [generateEntries, generateEntriesInProgress] = useHandledAsyncCallback(async () => {
|
||||
if (!inputDescription || !outputDescription || !numToGenerate || !datasetId) return;
|
||||
await generateEntriesMutation.mutateAsync({
|
||||
datasetId,
|
||||
inputDescription,
|
||||
outputDescription,
|
||||
numToGenerate,
|
||||
});
|
||||
await utils.datasetEntries.list.invalidate();
|
||||
onClose();
|
||||
}, [
|
||||
generateEntriesMutation,
|
||||
onClose,
|
||||
inputDescription,
|
||||
outputDescription,
|
||||
numToGenerate,
|
||||
datasetId,
|
||||
]);
|
||||
|
||||
return (
|
||||
<Modal isOpen={isOpen} onClose={onClose} size={{ base: "xl", sm: "2xl", md: "3xl" }}>
|
||||
<ModalOverlay />
|
||||
<ModalContent w={1200}>
|
||||
<ModalHeader>
|
||||
<HStack>
|
||||
<Icon as={BsStars} />
|
||||
<Text>Generate Data</Text>
|
||||
</HStack>
|
||||
</ModalHeader>
|
||||
<ModalCloseButton />
|
||||
<ModalBody maxW="unset">
|
||||
<VStack w="full" spacing={8} padding={8} alignItems="flex-start">
|
||||
<VStack alignItems="flex-start" spacing={2}>
|
||||
<Text fontWeight="bold">Number of Rows:</Text>
|
||||
<NumberInput
|
||||
step={5}
|
||||
defaultValue={15}
|
||||
min={0}
|
||||
max={100}
|
||||
onChange={(valueString) => setNumToGenerate(parseInt(valueString) || 0)}
|
||||
value={numToGenerate}
|
||||
w="24"
|
||||
>
|
||||
<NumberInputField />
|
||||
<NumberInputStepper>
|
||||
<NumberIncrementStepper />
|
||||
<NumberDecrementStepper />
|
||||
</NumberInputStepper>
|
||||
</NumberInput>
|
||||
</VStack>
|
||||
<VStack alignItems="flex-start" w="full" spacing={2}>
|
||||
<Text fontWeight="bold">Input Description:</Text>
|
||||
<AutoResizeTextArea
|
||||
value={inputDescription}
|
||||
onChange={(e) => setInputDescription(e.target.value)}
|
||||
placeholder="Each input should contain..."
|
||||
/>
|
||||
</VStack>
|
||||
<VStack alignItems="flex-start" w="full" spacing={2}>
|
||||
<Text fontWeight="bold">Output Description (optional):</Text>
|
||||
<AutoResizeTextArea
|
||||
value={outputDescription}
|
||||
onChange={(e) => setOutputDescription(e.target.value)}
|
||||
placeholder="The output should contain..."
|
||||
/>
|
||||
</VStack>
|
||||
</VStack>
|
||||
</ModalBody>
|
||||
<ModalFooter>
|
||||
<Button
|
||||
colorScheme="blue"
|
||||
isLoading={generateEntriesInProgress}
|
||||
isDisabled={!numToGenerate || !inputDescription || !outputDescription}
|
||||
onClick={generateEntries}
|
||||
>
|
||||
Generate
|
||||
</Button>
|
||||
</ModalFooter>
|
||||
</ModalContent>
|
||||
</Modal>
|
||||
);
|
||||
};
|
||||