Compare commits
2 commits (branch arcticfly-...show-heade...)

| Author | SHA1 | Date |
|---|---|---|
|  | a1e3036064 |  |
|  | 22a1423690 |  |
.gitignore (vendored, 1 changed line)

@@ -3,4 +3,3 @@
 *.pyc
 node_modules/
 *.tsbuildinfo
-dist/
README.md (106 changed lines)

@@ -1,52 +1,16 @@
-<p align="center">
-  <a href="https://openpipe.ai">
-    <img height="70" src="https://github.com/openpipe/openpipe/assets/41524992/70af25fb-1f90-42d9-8a20-3606e3b5aaba" alt="logo">
-  </a>
-</p>
-<h1 align="center">
-  OpenPipe
-</h1>
+<!-- <img src="https://github.com/openpipe/openpipe/assets/41524992/ca59596e-eb80-40f9-921f-6d67f6e6d8fa" width="72px" /> -->
 
-<p align="center">
-  <i>Turn expensive prompts into cheap fine-tuned models.</i>
-</p>
+# OpenPipe
 
-<p align="center">
-  <a href="/LICENSE"><img alt="License Apache-2.0" src="https://img.shields.io/github/license/openpipe/openpipe?style=flat-square"></a>
-  <a href='http://makeapullrequest.com'><img alt='PRs Welcome' src='https://img.shields.io/badge/PRs-welcome-brightgreen.svg?style=flat-square'/></a>
-  <a href="https://github.com/openpipe/openpipe/graphs/commit-activity"><img alt="GitHub commit activity" src="https://img.shields.io/github/commit-activity/m/openpipe/openpipe?style=flat-square"/></a>
-  <a href="https://github.com/openpipe/openpipe/issues"><img alt="GitHub closed issues" src="https://img.shields.io/github/issues-closed/openpipe/openpipe?style=flat-square"/></a>
-</p>
+OpenPipe is a flexible playground for comparing and optimizing LLM prompts. It lets you quickly generate, test and compare candidate prompts, and can automatically [translate](#-translate-between-model-apis) those prompts between models.
 
-<p align="center">
-  <a href="https://app.openpipe.ai/">Hosted App</a> - <a href="#running-locally">Running Locally</a> - <a href="#sample-experiments">Experiments</a>
-</p>
+<img src="https://github.com/openpipe/openpipe/assets/41524992/219a844e-3f4e-4f6b-8066-41348b42977b" alt="demo">
 
-<br>
-Use powerful but expensive LLMs to fine-tune smaller and cheaper models suited to your exact needs. Evaluate model and prompt combinations in the playground. Query your past requests and export optimized training data.
-<br>
-
-## 🪛 Features
-
-* <b>Fine-Tune</b>
-  * Easy integration with OpenPipe's SDK in both Python and JS.
-  * Swiftly query logs using intuitive built-in filters.
-  * Export data in multiple training formats, including Alpaca and ChatGPT, with deduplication.
-
-* <b>Experiment</b>
-  * Bulk-test wide-reaching scenarios using code templating.
-  * Seamlessly translate prompts across different model APIs.
-  * Tap into autogenerated scenarios for fresh test perspectives.
-
-<img src="https://github.com/openpipe/openpipe/assets/41524992/eaa8b92d-4536-4f63-bbef-4b0b1a60f6b5" alt="fine-tune demo">
-
-<!-- <img height="400px" src="https://github.com/openpipe/openpipe/assets/41524992/66bb1843-cb72-4130-a369-eec2df3b8201" alt="playground demo"> -->
+You can use our hosted version of OpenPipe at https://openpipe.ai. You can also clone this repository and [run it locally](#running-locally).
 
 ## Sample Experiments
 
-These are sample experiments users have created that show how OpenPipe works. Feel free to fork them and start experimenting yourself.
+These are simple experiments users have created that show how OpenPipe works. Feel free to fork them and start experimenting yourself.
 
 - [Twitter Sentiment Analysis](https://app.openpipe.ai/experiments/62c20a73-2012-4a64-973c-4b665ad46a57)
 - [Reddit User Needs](https://app.openpipe.ai/experiments/22222222-2222-2222-2222-222222222222)
@@ -55,25 +19,43 @@ These are sample experiments users have created that show how OpenPipe works. Fe
 
 ## Supported Models
 
-#### OpenAI
-- [GPT 3.5 Turbo](https://platform.openai.com/docs/guides/gpt/chat-completions-api)
-- [GPT 3.5 Turbo 16k](https://platform.openai.com/docs/guides/gpt/chat-completions-api)
-- [GPT 4](https://openai.com/gpt-4)
-
-#### Llama2
-- [7b chat](https://replicate.com/a16z-infra/llama7b-v2-chat)
-- [13b chat](https://replicate.com/a16z-infra/llama13b-v2-chat)
-- [70b chat](https://replicate.com/replicate/llama70b-v2-chat)
-
-#### Llama2 Fine-Tunes
-- [Open-Orca/OpenOrcaxOpenChat-Preview2-13B](https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B)
-- [Open-Orca/OpenOrca-Platypus2-13B](https://huggingface.co/Open-Orca/OpenOrca-Platypus2-13B)
-- [NousResearch/Nous-Hermes-Llama2-13b](https://huggingface.co/NousResearch/Nous-Hermes-Llama2-13b)
-- [jondurbin/airoboros-l2-13b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-2.0)
-- [lmsys/vicuna-13b-v1.5](https://huggingface.co/lmsys/vicuna-13b-v1.5)
-- [Gryphe/MythoMax-L2-13b](https://huggingface.co/Gryphe/MythoMax-L2-13b)
-- [NousResearch/Nous-Hermes-llama-2-7b](https://huggingface.co/NousResearch/Nous-Hermes-llama-2-7b)
-
-#### Anthropic
-- [Claude 1 Instant](https://www.anthropic.com/index/introducing-claude)
-- [Claude 2](https://www.anthropic.com/index/claude-2)
+- All models available through the OpenAI [chat completion API](https://platform.openai.com/docs/guides/gpt/chat-completions-api)
+- Llama2 [7b chat](https://replicate.com/a16z-infra/llama7b-v2-chat), [13b chat](https://replicate.com/a16z-infra/llama13b-v2-chat), [70b chat](https://replicate.com/replicate/llama70b-v2-chat).
+- Anthropic's [Claude 1 Instant](https://www.anthropic.com/index/introducing-claude) and [Claude 2](https://www.anthropic.com/index/claude-2)
+
+## Features
+
+### 🔍 Visualize Responses
+
+Inspect prompt completions side-by-side.
+
+### 🧪 Bulk-Test
+
+OpenPipe lets you _template_ a prompt. Use the templating feature to run the prompts you're testing against many potential inputs for broad coverage of your problem space.
+
+### 📟 Translate between Model APIs
+
+Write your prompt in one format and automatically convert it to work with any other model.
+
+<img width="480" alt="Screenshot 2023-08-01 at 11 55 38 PM" src="https://github.com/OpenPipe/OpenPipe/assets/41524992/1e19ccf2-96b6-4e93-a3a5-1449710d1b5b" alt="translate between models">
+
+<br><br>
+
+### 🛠️ Refine Your Prompts Automatically
+
+Use a growing database of best-practice refinements to improve your prompts automatically.
+
+<img width="480" alt="Screenshot 2023-08-01 at 11 55 38 PM" src="https://github.com/OpenPipe/OpenPipe/assets/41524992/87a27fe7-daef-445c-a5e2-1c82b23f9f99" alt="add function call">
+
+<br><br>
+
+### 🪄 Auto-generate Test Scenarios
+
+OpenPipe includes a tool to generate new test scenarios based on your existing prompts and scenarios. Just click "Autogenerate Scenario" to try it out!
+
+<img width="600" src="https://github.com/openpipe/openpipe/assets/41524992/219a844e-3f4e-4f6b-8066-41348b42977b" alt="auto-generate">
+
+<br><br>
 
 ## Running Locally
 
@@ -93,4 +75,4 @@ These are sample experiments users have created that show how OpenPipe works. Fe
 1. Copy your `.env` file to `.env.test`.
 2. Update the `DATABASE_URL` to have a different database name than your development one
 3. Run `DATABASE_URL=[your new datatase url] pnpm prisma migrate dev --skip-seed --skip-generate`
 4. Run `pnpm test`
app/@types/nextjs-routes.d.ts (vendored, 3 changed lines)

@@ -19,9 +19,10 @@ declare module "nextjs-routes" {
     | DynamicRoute<"/api/v1/[...trpc]", { "trpc": string[] }>
     | StaticRoute<"/api/v1/openapi">
     | StaticRoute<"/dashboard">
+    | DynamicRoute<"/data/[id]", { "id": string }>
+    | StaticRoute<"/data">
     | DynamicRoute<"/experiments/[experimentSlug]", { "experimentSlug": string }>
     | StaticRoute<"/experiments">
-    | StaticRoute<"/fine-tunes">
     | StaticRoute<"/">
     | DynamicRoute<"/invitations/[invitationToken]", { "invitationToken": string }>
     | StaticRoute<"/project/settings">
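A minimal usage sketch, not part of the diff: with the regenerated route types above, a typed link to the new dataset page compiles only if the pathname and query keys match. The `DatasetLink` component name is hypothetical; the `href` shape mirrors the one used in `DatasetCard.tsx` later in this comparison.

```tsx
// Sketch only: typed navigation to the new "/data/[id]" route added above.
import Link from "next/link";

export const DatasetLink = ({ id }: { id: string }) => (
  // An unknown pathname or a missing "id" query key would be a type error.
  <Link href={{ pathname: "/data/[id]", query: { id } }}>Open dataset</Link>
);
```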
@@ -23,6 +23,7 @@ ARG NEXT_PUBLIC_SOCKET_URL
 ARG NEXT_PUBLIC_HOST
 ARG NEXT_PUBLIC_SENTRY_DSN
 ARG SENTRY_AUTH_TOKEN
+ARG NEXT_PUBLIC_FF_SHOW_LOGGED_CALLS
 
 WORKDIR /code
 COPY --from=deps /code/node_modules ./node_modules
@@ -44,4 +45,4 @@ EXPOSE 3000
 ENV PORT 3000
 
 # Run the "run-prod.sh" script
-CMD /code/app/scripts/run-prod.sh
+CMD /code/app/run-prod.sh
@@ -12,8 +12,8 @@
     "build": "next build",
     "dev:next": "TZ=UTC next dev",
     "dev:wss": "pnpm tsx --watch src/wss-server.ts",
-    "worker": "NODE_ENV='development' pnpm tsx --watch src/server/tasks/worker.ts",
-    "dev": "concurrently --kill-others 'pnpm dev:next' 'pnpm dev:wss' 'pnpm worker --watch'",
+    "dev:worker": "NODE_ENV='development' pnpm tsx --watch src/server/tasks/worker.ts",
+    "dev": "concurrently --kill-others 'pnpm dev:next' 'pnpm dev:wss' 'pnpm dev:worker'",
     "postinstall": "prisma generate",
     "lint": "next lint",
     "start": "TZ=UTC next start",
@@ -48,7 +48,6 @@
     "@trpc/react-query": "^10.26.0",
     "@trpc/server": "^10.26.0",
     "@vercel/og": "^0.5.9",
-    "archiver": "^6.0.0",
     "ast-types": "^0.14.2",
     "chroma-js": "^2.4.2",
     "concurrently": "^8.2.0",
@@ -61,7 +60,6 @@
     "framer-motion": "^10.12.17",
     "gpt-tokens": "^1.0.10",
     "graphile-worker": "^0.13.0",
-    "human-id": "^4.0.0",
     "immer": "^10.0.2",
     "isolated-vm": "^4.5.0",
     "json-schema-to-typescript": "^13.0.2",
@@ -100,7 +98,6 @@
     "replicate": "^0.12.3",
     "socket.io": "^4.7.1",
     "socket.io-client": "^4.7.1",
-    "stream-buffers": "^3.0.2",
     "superjson": "1.12.2",
     "trpc-openapi": "^1.2.0",
     "tsx": "^3.12.7",
@@ -113,7 +110,6 @@
   },
   "devDependencies": {
     "@openapi-contrib/openapi-schema-to-json-schema": "^4.0.5",
-    "@types/archiver": "^5.3.2",
     "@types/babel__core": "^7.20.1",
     "@types/babel__standalone": "^7.1.4",
     "@types/chroma-js": "^2.4.0",
@@ -130,7 +126,6 @@
     "@types/react": "^18.2.6",
     "@types/react-dom": "^18.2.4",
     "@types/react-syntax-highlighter": "^15.5.7",
-    "@types/stream-buffers": "^3.0.4",
     "@types/uuid": "^9.0.2",
     "@typescript-eslint/eslint-plugin": "^5.59.6",
     "@typescript-eslint/parser": "^5.59.6",
@@ -1,48 +0,0 @@
-/*
-  Warnings:
-
-  - You are about to drop the column `input` on the `DatasetEntry` table. All the data in the column will be lost.
-  - You are about to drop the column `output` on the `DatasetEntry` table. All the data in the column will be lost.
-  - Added the required column `loggedCallId` to the `DatasetEntry` table without a default value. This is not possible if the table is not empty.
-
-*/
--- AlterTable
-ALTER TABLE "DatasetEntry" DROP COLUMN "input",
-DROP COLUMN "output",
-ADD COLUMN "loggedCallId" UUID NOT NULL;
-
--- AddForeignKey
-ALTER TABLE "DatasetEntry" ADD CONSTRAINT "DatasetEntry_loggedCallId_fkey" FOREIGN KEY ("loggedCallId") REFERENCES "LoggedCall"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-
--- AlterTable
-ALTER TABLE "LoggedCallModelResponse" ALTER COLUMN "cost" SET DATA TYPE DOUBLE PRECISION;
-
--- CreateEnum
-CREATE TYPE "FineTuneStatus" AS ENUM ('PENDING', 'TRAINING', 'AWAITING_DEPLOYMENT', 'DEPLOYING', 'DEPLOYED', 'ERROR');
-
--- CreateTable
-CREATE TABLE "FineTune" (
-    "id" UUID NOT NULL,
-    "slug" TEXT NOT NULL,
-    "baseModel" TEXT NOT NULL,
-    "status" "FineTuneStatus" NOT NULL DEFAULT 'PENDING',
-    "trainingStartedAt" TIMESTAMP(3),
-    "trainingFinishedAt" TIMESTAMP(3),
-    "deploymentStartedAt" TIMESTAMP(3),
-    "deploymentFinishedAt" TIMESTAMP(3),
-    "datasetId" UUID NOT NULL,
-    "projectId" UUID NOT NULL,
-    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
-    "updatedAt" TIMESTAMP(3) NOT NULL,
-
-    CONSTRAINT "FineTune_pkey" PRIMARY KEY ("id")
-);
-
--- CreateIndex
-CREATE UNIQUE INDEX "FineTune_slug_key" ON "FineTune"("slug");
-
--- AddForeignKey
-ALTER TABLE "FineTune" ADD CONSTRAINT "FineTune_datasetId_fkey" FOREIGN KEY ("datasetId") REFERENCES "Dataset"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-
--- AddForeignKey
-ALTER TABLE "FineTune" ADD CONSTRAINT "FineTune_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;
@@ -181,7 +181,6 @@ model Dataset {
 
   name           String
   datasetEntries DatasetEntry[]
-  fineTunes      FineTune[]
 
   projectId String  @db.Uuid
   project   Project @relation(fields: [projectId], references: [id], onDelete: Cascade)
@@ -193,8 +192,8 @@ model Dataset {
 model DatasetEntry {
   id String @id @default(uuid()) @db.Uuid
 
-  loggedCallId String     @db.Uuid
-  loggedCall   LoggedCall @relation(fields: [loggedCallId], references: [id], onDelete: Cascade)
+  input  String
+  output String?
 
   datasetId String   @db.Uuid
   dataset   Dataset? @relation(fields: [datasetId], references: [id], onDelete: Cascade)
@@ -217,7 +216,6 @@ model Project {
   experiments Experiment[]
   datasets    Dataset[]
   loggedCalls LoggedCall[]
-  fineTunes   FineTune[]
   apiKeys     ApiKey[]
 }
 
@@ -278,9 +276,8 @@ model LoggedCall {
   projectId String   @db.Uuid
   project   Project? @relation(fields: [projectId], references: [id], onDelete: Cascade)
 
   model String?
   tags  LoggedCallTag[]
-  datasetEntries DatasetEntry[]
 
   createdAt DateTime @default(now())
   updatedAt DateTime @updatedAt
@@ -315,7 +312,7 @@ model LoggedCallModelResponse {
   outputTokens Int?
   finishReason String?
   completionId String?
-  cost         Float?
+  cost         Decimal? @db.Decimal(18, 12)
 
   // The LoggedCall that created this LoggedCallModelResponse
   originalLoggedCallId String @unique @db.Uuid
@@ -430,33 +427,3 @@ model VerificationToken {
 
   @@unique([identifier, token])
 }
-
-enum FineTuneStatus {
-  PENDING
-  TRAINING
-  AWAITING_DEPLOYMENT
-  DEPLOYING
-  DEPLOYED
-  ERROR
-}
-
-model FineTune {
-  id String @id @default(uuid()) @db.Uuid
-
-  slug      String         @unique
-  baseModel String
-  status    FineTuneStatus @default(PENDING)
-  trainingStartedAt    DateTime?
-  trainingFinishedAt   DateTime?
-  deploymentStartedAt  DateTime?
-  deploymentFinishedAt DateTime?
-
-  datasetId String  @db.Uuid
-  dataset   Dataset @relation(fields: [datasetId], references: [id], onDelete: Cascade)
-
-  projectId String  @db.Uuid
-  project   Project @relation(fields: [projectId], references: [id], onDelete: Cascade)
-
-  createdAt DateTime @default(now())
-  updatedAt DateTime @updatedAt
-}
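A minimal sketch, not part of the diff: with the schema change above, a `DatasetEntry` row again carries `input` (required) and `output` (optional) directly instead of pointing at a `LoggedCall`. Assuming the generated Prisma client from this schema, reading entries for a dataset might look like:

```ts
// Sketch only, assuming the Prisma client generated from the schema above.
import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();

// Fetch the raw input/output pairs for one dataset.
export const listEntries = (datasetId: string) =>
  prisma.datasetEntry.findMany({
    where: { datasetId },
    select: { id: true, input: true, output: true },
  });
```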
@@ -10,4 +10,6 @@ pnpm tsx src/promptConstructor/migrate.ts
 
 echo "Starting the server"
 
-pnpm start
+pnpm concurrently --kill-others \
+  "pnpm start" \
+  "pnpm tsx src/server/tasks/worker.ts"
@@ -1,6 +0,0 @@
-#! /bin/bash
-
-set -e
-cd "$(dirname "$0")/.."
-apt-get update
-apt-get install -y htop psql
@@ -1,10 +0,0 @@
-#! /bin/bash
-
-set -e
-
-echo "Migrating the database"
-pnpm prisma migrate deploy
-
-echo "Starting 4 workers"
-
-pnpm concurrently "pnpm worker" "pnpm worker" "pnpm worker" "pnpm worker"
@@ -1,13 +0,0 @@
-#! /bin/bash
-
-set -e
-
-cd "$(dirname "$0")/../.."
-
-echo "Env is"
-echo $ENVIRONMENT
-
-docker build . --file app/Dockerfile --tag "openpipe-prod"
-
-# Run the image
-docker run --env-file app/.env -it --entrypoint "/bin/bash" "openpipe-prod"
@@ -3,7 +3,6 @@
 // https://docs.sentry.io/platforms/javascript/guides/nextjs/
 
 import * as Sentry from "@sentry/nextjs";
-import { isError } from "lodash-es";
 import { env } from "~/env.mjs";
 
 if (env.NEXT_PUBLIC_SENTRY_DSN) {
@@ -16,10 +15,4 @@ if (env.NEXT_PUBLIC_SENTRY_DSN) {
   // Setting this option to true will print useful information to the console while you're setting up Sentry.
   debug: false,
   });
-} else {
-  // Install local debug exception handler for rejected promises
-  process.on("unhandledRejection", (reason) => {
-    const reasonDetails = isError(reason) ? reason?.stack : reason;
-    console.log("Unhandled Rejection at:", reasonDetails);
-  });
 }
@@ -1,14 +0,0 @@
-import { Tooltip, Icon, VStack } from "@chakra-ui/react";
-import { RiInformationFill } from "react-icons/ri";
-
-const InfoCircle = ({ tooltipText }: { tooltipText: string }) => {
-  return (
-    <Tooltip label={tooltipText} fontSize="sm" shouldWrapChildren maxW={80}>
-      <VStack>
-        <Icon as={RiInformationFill} boxSize={5} color="gray.500" />
-      </VStack>
-    </Tooltip>
-  );
-};
-
-export default InfoCircle;
@@ -11,7 +11,6 @@ import {
   Button,
   Text,
   useDisclosure,
-  type InputGroupProps,
 } from "@chakra-ui/react";
 
 import { FiChevronDown } from "react-icons/fi";
@@ -21,25 +20,15 @@ type InputDropdownProps<T> = {
   options: ReadonlyArray<T>;
   selectedOption: T;
   onSelect: (option: T) => void;
-  inputGroupProps?: InputGroupProps;
 };
 
-const InputDropdown = <T,>({
-  options,
-  selectedOption,
-  onSelect,
-  inputGroupProps,
-}: InputDropdownProps<T>) => {
+const InputDropdown = <T,>({ options, selectedOption, onSelect }: InputDropdownProps<T>) => {
   const popover = useDisclosure();
 
   return (
     <Popover placement="bottom-start" {...popover}>
       <PopoverTrigger>
-        <InputGroup
-          cursor="pointer"
-          w={(selectedOption as string).length * 14 + 180}
-          {...inputGroupProps}
-        >
+        <InputGroup cursor="pointer" w={(selectedOption as string).length * 14 + 180}>
           <Input
             value={selectedOption as string}
             // eslint-disable-next-line @typescript-eslint/no-empty-function -- controlled input requires onChange
@@ -43,7 +43,7 @@ export default function OutputCell({
 
   type OutputSchema = Parameters<typeof provider.normalizeOutput>[0];
 
-  const { mutateAsync: hardRefetchMutate } = api.scenarioVariantCells.hardRefetch.useMutation();
+  const { mutateAsync: hardRefetchMutate } = api.scenarioVariantCells.forceRefetch.useMutation();
   const [hardRefetch, hardRefetching] = useHandledAsyncCallback(async () => {
     await hardRefetchMutate({ scenarioId: scenario.id, variantId: variant.id });
     await utils.scenarioVariantCells.get.invalidate({
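The client-side rename above (`hardRefetch` to `forceRefetch`) implies a matching rename of the procedure on the tRPC router. The router is not shown in this comparison, so the following is a sketch under stated assumptions: the file path, `createTRPCRouter`, and `protectedProcedure` names are conventions from typical tRPC setups, not confirmed by the diff.

```ts
// Sketch only: assumed shape of the renamed server-side procedure.
import { z } from "zod";
import { createTRPCRouter, protectedProcedure } from "~/server/api/trpc";

export const scenarioVariantCellsRouter = createTRPCRouter({
  // formerly `hardRefetch`; input mirrors the client call above
  forceRefetch: protectedProcedure
    .input(z.object({ scenarioId: z.string(), variantId: z.string() }))
    .mutation(async ({ input }) => {
      // ...re-run the cell for the given scenario/variant pair
    }),
});
```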
@@ -110,7 +110,7 @@ export default function VariantEditor(props: { variant: PromptVariant }) {
     setIsChanged(false);
 
     await utils.promptVariants.list.invalidate();
-  }, [checkForChanges, replaceVariant.mutateAsync]);
+  }, [checkForChanges]);
 
   useEffect(() => {
     if (monaco) {
@@ -77,7 +77,6 @@ export default function OutputsTable({ experimentId }: { experimentId: string |
             {...sharedProps}
             borderBottomLeftRadius={isFirst ? 8 : 0}
             borderBottomRightRadius={isLast ? 8 : 0}
-            boxShadow="5px 5px 15px 1px rgba(0, 0, 0, 0.1);"
           >
             <VariantStats variant={variant} />
           </GridItem>
@@ -1,19 +1,15 @@
-import {
-  HStack,
-  IconButton,
-  Text,
-  Select,
-  type StackProps,
-  Icon,
-  useBreakpointValue,
-} from "@chakra-ui/react";
+import { HStack, IconButton, Text, Select, type StackProps, Icon } from "@chakra-ui/react";
 import React, { useCallback } from "react";
 import { FiChevronsLeft, FiChevronsRight, FiChevronLeft, FiChevronRight } from "react-icons/fi";
 import { usePageParams } from "~/utils/hooks";
 
 const pageSizeOptions = [10, 25, 50, 100];
 
-const Paginator = ({ count, ...props }: { count: number; condense?: boolean } & StackProps) => {
+const Paginator = ({
+  count,
+  condense,
+  ...props
+}: { count: number; condense?: boolean } & StackProps) => {
   const { page, pageSize, setPageParams } = usePageParams();
 
   const lastPage = Math.ceil(count / pageSize);
@@ -41,9 +37,6 @@ const Paginator = ({ count, ...props }: { count: number; condense?: boolean } &
   const goToLastPage = () => setPageParams({ page: lastPage }, "replace");
   const goToFirstPage = () => setPageParams({ page: 1 }, "replace");
 
-  const isMobile = useBreakpointValue({ base: true, md: false });
-  const condense = isMobile || props.condense;
-
   if (count === 0) return null;
 
   return (
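With this change, `condense` is an explicit prop that callers pass in, rather than something the component derives from `useBreakpointValue` internally. A small usage sketch (the `CondensedList` wrapper is hypothetical; `Paginator` also spreads remaining `StackProps` such as `pt`):

```tsx
// Sketch only: passing the now-explicit `condense` prop from a caller.
import Paginator from "~/components/Paginator";

export const CondensedList = ({ count }: { count: number }) => (
  <Paginator count={count} condense pt={4} />
);
```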
app/src/components/datasets/DatasetCard.tsx (new file, 112 lines)

@@ -0,0 +1,112 @@
|
|||||||
|
import {
|
||||||
|
HStack,
|
||||||
|
Icon,
|
||||||
|
VStack,
|
||||||
|
Text,
|
||||||
|
Divider,
|
||||||
|
Spinner,
|
||||||
|
AspectRatio,
|
||||||
|
SkeletonText,
|
||||||
|
} from "@chakra-ui/react";
|
||||||
|
import { RiDatabase2Line } from "react-icons/ri";
|
||||||
|
import { formatTimePast } from "~/utils/dayjs";
|
||||||
|
import Link from "next/link";
|
||||||
|
import { useRouter } from "next/router";
|
||||||
|
import { BsPlusSquare } from "react-icons/bs";
|
||||||
|
import { api } from "~/utils/api";
|
||||||
|
import { useHandledAsyncCallback } from "~/utils/hooks";
|
||||||
|
import { useAppStore } from "~/state/store";
|
||||||
|
|
||||||
|
type DatasetData = {
|
||||||
|
name: string;
|
||||||
|
numEntries: number;
|
||||||
|
id: string;
|
||||||
|
createdAt: Date;
|
||||||
|
updatedAt: Date;
|
||||||
|
};
|
||||||
|
|
||||||
|
export const DatasetCard = ({ dataset }: { dataset: DatasetData }) => {
|
||||||
|
return (
|
||||||
|
<AspectRatio ratio={1.2} w="full">
|
||||||
|
<VStack
|
||||||
|
as={Link}
|
||||||
|
href={{ pathname: "/data/[id]", query: { id: dataset.id } }}
|
||||||
|
bg="gray.50"
|
||||||
|
_hover={{ bg: "gray.100" }}
|
||||||
|
transition="background 0.2s"
|
||||||
|
cursor="pointer"
|
||||||
|
borderColor="gray.200"
|
||||||
|
borderWidth={1}
|
||||||
|
p={4}
|
||||||
|
justify="space-between"
|
||||||
|
>
|
||||||
|
<HStack w="full" color="gray.700" justify="center">
|
||||||
|
<Icon as={RiDatabase2Line} boxSize={4} />
|
||||||
|
<Text fontWeight="bold">{dataset.name}</Text>
|
||||||
|
</HStack>
|
||||||
|
<HStack h="full" spacing={4} flex={1} align="center">
|
||||||
|
<CountLabel label="Rows" count={dataset.numEntries} />
|
||||||
|
</HStack>
|
||||||
|
<HStack w="full" color="gray.500" fontSize="xs" textAlign="center">
|
||||||
|
<Text flex={1}>Created {formatTimePast(dataset.createdAt)}</Text>
|
||||||
|
<Divider h={4} orientation="vertical" />
|
||||||
|
<Text flex={1}>Updated {formatTimePast(dataset.updatedAt)}</Text>
|
||||||
|
</HStack>
|
||||||
|
</VStack>
|
||||||
|
</AspectRatio>
|
||||||
|
);
|
||||||
|
};
|
||||||
|
|
||||||
|
const CountLabel = ({ label, count }: { label: string; count: number }) => {
|
||||||
|
return (
|
||||||
|
<VStack alignItems="center" flex={1}>
|
||||||
|
<Text color="gray.500" fontWeight="bold">
|
||||||
|
{label}
|
||||||
|
</Text>
|
||||||
|
<Text fontSize="sm" color="gray.500">
|
||||||
|
{count}
|
||||||
|
</Text>
|
||||||
|
</VStack>
|
||||||
|
);
|
||||||
|
};
|
||||||
|
|
||||||
|
export const NewDatasetCard = () => {
|
||||||
|
const router = useRouter();
|
||||||
|
const selectedProjectId = useAppStore((s) => s.selectedProjectId);
|
||||||
|
const createMutation = api.datasets.create.useMutation();
|
||||||
|
const [createDataset, isLoading] = useHandledAsyncCallback(async () => {
|
||||||
|
const newDataset = await createMutation.mutateAsync({ projectId: selectedProjectId ?? "" });
|
||||||
|
await router.push({ pathname: "/data/[id]", query: { id: newDataset.id } });
|
||||||
|
}, [createMutation, router, selectedProjectId]);
|
||||||
|
|
||||||
|
return (
|
||||||
|
<AspectRatio ratio={1.2} w="full">
|
||||||
|
<VStack
|
||||||
|
align="center"
|
||||||
|
justify="center"
|
||||||
|
_hover={{ cursor: "pointer", bg: "gray.50" }}
|
||||||
|
transition="background 0.2s"
|
||||||
|
cursor="pointer"
|
||||||
|
borderColor="gray.200"
|
||||||
|
borderWidth={1}
|
||||||
|
p={4}
|
||||||
|
onClick={createDataset}
|
||||||
|
>
|
||||||
|
<Icon as={isLoading ? Spinner : BsPlusSquare} boxSize={8} />
|
||||||
|
<Text display={{ base: "none", md: "block" }} ml={2}>
|
||||||
|
New Dataset
|
||||||
|
</Text>
|
||||||
|
</VStack>
|
||||||
|
</AspectRatio>
|
||||||
|
);
|
||||||
|
};
|
||||||
|
|
||||||
|
export const DatasetCardSkeleton = () => (
|
||||||
|
<AspectRatio ratio={1.2} w="full">
|
||||||
|
<VStack align="center" borderColor="gray.200" borderWidth={1} p={4} bg="gray.50">
|
||||||
|
<SkeletonText noOfLines={1} w="80%" />
|
||||||
|
<SkeletonText noOfLines={2} w="60%" />
|
||||||
|
<SkeletonText noOfLines={1} w="80%" />
|
||||||
|
</VStack>
|
||||||
|
</AspectRatio>
|
||||||
|
);
|
||||||
app/src/components/datasets/DatasetEntriesPaginator.tsx (new file, 16 lines)

@@ -0,0 +1,16 @@
+import { type StackProps } from "@chakra-ui/react";
+
+import { useDatasetEntries } from "~/utils/hooks";
+import Paginator from "../Paginator";
+
+const DatasetEntriesPaginator = (props: StackProps) => {
+  const { data } = useDatasetEntries();
+
+  if (!data) return null;
+
+  const { count } = data;
+
+  return <Paginator count={count} {...props} />;
+};
+
+export default DatasetEntriesPaginator;
app/src/components/datasets/DatasetEntriesTable.tsx (new file, 31 lines)

@@ -0,0 +1,31 @@
+import { type StackProps, VStack, Table, Th, Tr, Thead, Tbody, Text } from "@chakra-ui/react";
+import { useDatasetEntries } from "~/utils/hooks";
+import TableRow from "./TableRow";
+import DatasetEntriesPaginator from "./DatasetEntriesPaginator";
+
+const DatasetEntriesTable = (props: StackProps) => {
+  const { data } = useDatasetEntries();
+
+  return (
+    <VStack justifyContent="space-between" {...props}>
+      <Table variant="simple" sx={{ "table-layout": "fixed", width: "full" }}>
+        <Thead>
+          <Tr>
+            <Th>Input</Th>
+            <Th>Output</Th>
+          </Tr>
+        </Thead>
+        <Tbody>{data?.entries.map((entry) => <TableRow key={entry.id} entry={entry} />)}</Tbody>
+      </Table>
+      {(!data || data.entries.length) === 0 ? (
+        <Text alignSelf="flex-start" pl={6} color="gray.500">
+          No entries found
+        </Text>
+      ) : (
+        <DatasetEntriesPaginator />
+      )}
+    </VStack>
+  );
+};
+
+export default DatasetEntriesTable;
@@ -0,0 +1,26 @@
+import { Button, HStack, useDisclosure } from "@chakra-ui/react";
+import { BiImport } from "react-icons/bi";
+import { BsStars } from "react-icons/bs";
+
+import { GenerateDataModal } from "./GenerateDataModal";
+
+export const DatasetHeaderButtons = () => {
+  const generateModalDisclosure = useDisclosure();
+
+  return (
+    <>
+      <HStack>
+        <Button leftIcon={<BiImport />} colorScheme="blue" variant="ghost">
+          Import Data
+        </Button>
+        <Button leftIcon={<BsStars />} colorScheme="blue" onClick={generateModalDisclosure.onOpen}>
+          Generate Data
+        </Button>
+      </HStack>
+      <GenerateDataModal
+        isOpen={generateModalDisclosure.isOpen}
+        onClose={generateModalDisclosure.onClose}
+      />
+    </>
+  );
+};
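A hypothetical composition sketch, not part of the diff: a dataset page could mount the table and header buttons added in this commit. The page file and the import path for `DatasetHeaderButtons` are assumptions (its path is not shown in this comparison); `DatasetEntriesTable` lives at the path shown above.

```tsx
// Sketch only: composing the new dataset components on a page.
import { VStack } from "@chakra-ui/react";
import DatasetEntriesTable from "~/components/datasets/DatasetEntriesTable";
import { DatasetHeaderButtons } from "~/components/datasets/DatasetHeaderButtons";

export default function DatasetPage() {
  return (
    <VStack w="full" align="stretch" p={8}>
      <DatasetHeaderButtons />
      <DatasetEntriesTable w="full" />
    </VStack>
  );
}
```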
@@ -0,0 +1,128 @@
|
|||||||
|
import {
|
||||||
|
Modal,
|
||||||
|
ModalBody,
|
||||||
|
ModalCloseButton,
|
||||||
|
ModalContent,
|
||||||
|
ModalHeader,
|
||||||
|
ModalOverlay,
|
||||||
|
ModalFooter,
|
||||||
|
Text,
|
||||||
|
HStack,
|
||||||
|
VStack,
|
||||||
|
Icon,
|
||||||
|
NumberInput,
|
||||||
|
NumberInputField,
|
||||||
|
NumberInputStepper,
|
||||||
|
NumberIncrementStepper,
|
||||||
|
NumberDecrementStepper,
|
||||||
|
Button,
|
||||||
|
} from "@chakra-ui/react";
|
||||||
|
import { BsStars } from "react-icons/bs";
|
||||||
|
import { useState } from "react";
|
||||||
|
import { useDataset, useHandledAsyncCallback } from "~/utils/hooks";
|
||||||
|
import { api } from "~/utils/api";
|
||||||
|
import AutoResizeTextArea from "~/components/AutoResizeTextArea";
|
||||||
|
|
||||||
|
export const GenerateDataModal = ({
|
||||||
|
isOpen,
|
||||||
|
onClose,
|
||||||
|
}: {
|
||||||
|
isOpen: boolean;
|
||||||
|
onClose: () => void;
|
||||||
|
}) => {
|
||||||
|
const utils = api.useContext();
|
||||||
|
|
||||||
|
const datasetId = useDataset().data?.id;
|
||||||
|
|
||||||
|
const [numToGenerate, setNumToGenerate] = useState<number>(20);
|
||||||
|
const [inputDescription, setInputDescription] = useState<string>(
|
||||||
|
"Each input should contain an email body. Half of the emails should contain event details, and the other half should not.",
|
||||||
|
);
|
||||||
|
const [outputDescription, setOutputDescription] = useState<string>(
|
||||||
|
`Each output should contain "true" or "false", where "true" indicates that the email contains event details.`,
|
||||||
|
);
|
||||||
|
|
||||||
|
const generateEntriesMutation = api.datasetEntries.autogenerateEntries.useMutation();
|
||||||
|
|
||||||
|
const [generateEntries, generateEntriesInProgress] = useHandledAsyncCallback(async () => {
|
||||||
|
if (!inputDescription || !outputDescription || !numToGenerate || !datasetId) return;
|
||||||
|
await generateEntriesMutation.mutateAsync({
|
||||||
|
datasetId,
|
||||||
|
inputDescription,
|
||||||
|
outputDescription,
|
||||||
|
numToGenerate,
|
||||||
|
});
|
||||||
|
await utils.datasetEntries.list.invalidate();
|
||||||
|
onClose();
|
||||||
|
}, [
|
||||||
|
generateEntriesMutation,
|
||||||
|
onClose,
|
||||||
|
inputDescription,
|
||||||
|
outputDescription,
|
||||||
|
numToGenerate,
|
||||||
|
datasetId,
|
||||||
|
]);
|
||||||
|
|
||||||
|
return (
|
||||||
|
<Modal isOpen={isOpen} onClose={onClose} size={{ base: "xl", sm: "2xl", md: "3xl" }}>
|
||||||
|
<ModalOverlay />
|
||||||
|
<ModalContent w={1200}>
|
||||||
|
<ModalHeader>
|
||||||
|
<HStack>
|
||||||
|
<Icon as={BsStars} />
|
||||||
|
<Text>Generate Data</Text>
|
||||||
|
</HStack>
|
||||||
|
</ModalHeader>
|
||||||
|
<ModalCloseButton />
|
||||||
|
<ModalBody maxW="unset">
|
||||||
|
<VStack w="full" spacing={8} padding={8} alignItems="flex-start">
|
||||||
|
<VStack alignItems="flex-start" spacing={2}>
|
||||||
|
<Text fontWeight="bold">Number of Rows:</Text>
|
||||||
|
<NumberInput
|
||||||
|
step={5}
|
||||||
|
defaultValue={15}
|
||||||
|
min={0}
|
||||||
|
max={100}
|
||||||
|
onChange={(valueString) => setNumToGenerate(parseInt(valueString) || 0)}
|
||||||
|
value={numToGenerate}
|
||||||
|
w="24"
|
||||||
|
>
|
||||||
|
<NumberInputField />
|
||||||
|
<NumberInputStepper>
|
||||||
|
<NumberIncrementStepper />
|
||||||
|
<NumberDecrementStepper />
|
||||||
|
</NumberInputStepper>
|
||||||
|
</NumberInput>
|
||||||
|
</VStack>
|
||||||
|
<VStack alignItems="flex-start" w="full" spacing={2}>
|
||||||
|
<Text fontWeight="bold">Input Description:</Text>
|
||||||
|
<AutoResizeTextArea
|
||||||
|
value={inputDescription}
|
||||||
|
onChange={(e) => setInputDescription(e.target.value)}
|
||||||
|
placeholder="Each input should contain..."
|
||||||
|
/>
|
||||||
|
</VStack>
|
||||||
|
<VStack alignItems="flex-start" w="full" spacing={2}>
|
||||||
|
<Text fontWeight="bold">Output Description (optional):</Text>
|
||||||
|
<AutoResizeTextArea
|
||||||
|
value={outputDescription}
|
||||||
|
onChange={(e) => setOutputDescription(e.target.value)}
|
||||||
|
placeholder="The output should contain..."
|
||||||
|
/>
|
||||||
|
</VStack>
|
||||||
|
</VStack>
|
||||||
|
</ModalBody>
|
||||||
|
<ModalFooter>
|
||||||
|
<Button
|
||||||
|
colorScheme="blue"
|
||||||
|
isLoading={generateEntriesInProgress}
|
||||||
|
isDisabled={!numToGenerate || !inputDescription || !outputDescription}
|
||||||
|
onClick={generateEntries}
|
||||||
|
>
|
||||||
|
Generate
|
||||||
|
</Button>
|
||||||
|
</ModalFooter>
|
||||||
|
</ModalContent>
|
||||||
|
</Modal>
|
||||||
|
);
|
||||||
|
};
|
||||||
app/src/components/datasets/TableRow.tsx (new file, 13 lines)

@@ -0,0 +1,13 @@
+import { Td, Tr } from "@chakra-ui/react";
+import { type DatasetEntry } from "@prisma/client";
+
+const TableRow = ({ entry }: { entry: DatasetEntry }) => {
+  return (
+    <Tr key={entry.id}>
+      <Td>{entry.input}</Td>
+      <Td>{entry.output}</Td>
+    </Tr>
+  );
+};
+
+export default TableRow;
@@ -1,65 +0,0 @@
-import { Card, Table, Thead, Tr, Th, Tbody, Td, VStack, Icon, Text } from "@chakra-ui/react";
-import { FaTable } from "react-icons/fa";
-import { type FineTuneStatus } from "@prisma/client";
-
-import dayjs from "~/utils/dayjs";
-import { useFineTunes } from "~/utils/hooks";
-
-const FineTunesTable = ({}) => {
-  const { data } = useFineTunes();
-
-  const fineTunes = data?.fineTunes || [];
-
-  return (
-    <Card width="100%" overflowX="auto">
-      {fineTunes.length ? (
-        <Table>
-          <Thead>
-            <Tr>
-              <Th>ID</Th>
-              <Th>Created At</Th>
-              <Th>Base Model</Th>
-              <Th>Dataset Size</Th>
-              <Th>Status</Th>
-            </Tr>
-          </Thead>
-          <Tbody>
-            {fineTunes.map((fineTune) => {
-              return (
-                <Tr key={fineTune.id}>
-                  <Td>{fineTune.slug}</Td>
-                  <Td>{dayjs(fineTune.createdAt).format("MMMM D h:mm A")}</Td>
-                  <Td>{fineTune.baseModel}</Td>
-                  <Td>{fineTune.dataset._count.datasetEntries}</Td>
-                  <Td fontSize="sm" fontWeight="bold">
-                    <Text color={getStatusColor(fineTune.status)}>{fineTune.status}</Text>
-                  </Td>
-                </Tr>
-              );
-            })}
-          </Tbody>
-        </Table>
-      ) : (
-        <VStack py={8}>
-          <Icon as={FaTable} boxSize={16} color="gray.300" />
-          <Text color="gray.400" fontSize="lg" fontWeight="bold">
-            No Fine Tunes Found
-          </Text>
-        </VStack>
-      )}
-    </Card>
-  );
-};
-
-export default FineTunesTable;
-
-const getStatusColor = (status: FineTuneStatus) => {
-  switch (status) {
-    case "DEPLOYED":
-      return "green.500";
-    case "ERROR":
-      return "red.500";
-    default:
-      return "yellow.500";
-  }
-};
@@ -15,14 +15,12 @@ import Head from "next/head";
 import Link from "next/link";
 import { BsGearFill, BsGithub, BsPersonCircle } from "react-icons/bs";
 import { IoStatsChartOutline } from "react-icons/io5";
-import { RiHome3Line, RiFlaskLine } from "react-icons/ri";
-import { FaRobot } from "react-icons/fa";
+import { RiHome3Line, RiDatabase2Line, RiFlaskLine } from "react-icons/ri";
 import { signIn, useSession } from "next-auth/react";
+import { env } from "~/env.mjs";
 import ProjectMenu from "./ProjectMenu";
 import NavSidebarOption from "./NavSidebarOption";
 import IconLink from "./IconLink";
-import { BetaModal } from "./BetaModal";
-import { useAppStore } from "~/state/store";
 
 const Divider = () => <Box h="1px" bgColor="gray.300" w="full" />;
 
@@ -73,10 +71,21 @@ const NavSidebar = () => {
         <ProjectMenu />
         <Divider />
 
-        <IconLink icon={RiHome3Line} label="Dashboard" href="/dashboard" beta />
-        <IconLink icon={IoStatsChartOutline} label="Request Logs" href="/request-logs" beta />
-        <IconLink icon={FaRobot} label="Fine Tunes" href="/fine-tunes" beta />
+        {env.NEXT_PUBLIC_FF_SHOW_LOGGED_CALLS && (
+          <>
+            <IconLink icon={RiHome3Line} label="Dashboard" href="/dashboard" beta />
+            <IconLink
+              icon={IoStatsChartOutline}
+              label="Request Logs"
+              href="/request-logs"
+              beta
+            />
+          </>
+        )}
         <IconLink icon={RiFlaskLine} label="Experiments" href="/experiments" />
+        {env.NEXT_PUBLIC_SHOW_DATA && (
+          <IconLink icon={RiDatabase2Line} label="Data" href="/data" />
+        )}
         <VStack w="full" alignItems="flex-start" spacing={0} pt={8}>
           <Text
             pl={2}
@@ -96,7 +105,7 @@ const NavSidebar = () => {
       <NavSidebarOption>
         <HStack
           w="full"
-          p={{ base: 2, md: 4 }}
+          p={4}
           as={ChakraLink}
           justifyContent="start"
           onClick={() => {
@@ -132,12 +141,10 @@ export default function AppShell({
   children,
   title,
   requireAuth,
-  requireBeta,
 }: {
   children: React.ReactNode;
   title?: string;
   requireAuth?: boolean;
-  requireBeta?: boolean;
 }) {
   const [vh, setVh] = useState("100vh"); // Default height to prevent flicker on initial render
 
@@ -167,21 +174,15 @@ export default function AppShell({
     }
   }, [requireAuth, user, authLoading]);
 
-  const flags = useAppStore((s) => s.featureFlags.featureFlags);
-  const flagsLoaded = useAppStore((s) => s.featureFlags.flagsLoaded);
-
   return (
-    <>
-      <Flex h={vh} w="100vw">
-        <Head>
-          <title>{title ? `${title} | OpenPipe` : "OpenPipe"}</title>
-        </Head>
-        <NavSidebar />
-        <Box h="100%" flex={1} overflowY="auto" bgColor="gray.50">
-          {children}
-        </Box>
-      </Flex>
-      {requireBeta && flagsLoaded && !flags.betaAccess && <BetaModal />}
-    </>
+    <Flex h={vh} w="100vw">
+      <Head>
+        <title>{title ? `${title} | OpenPipe` : "OpenPipe"}</title>
+      </Head>
+      <NavSidebar />
+      <Box h="100%" flex={1} overflowY="auto" bgColor="gray.50">
+        {children}
+      </Box>
+    </Flex>
   );
 }
@@ -1,67 +0,0 @@
-import {
-  Button,
-  Modal,
-  ModalBody,
-  ModalContent,
-  ModalFooter,
-  ModalHeader,
-  ModalOverlay,
-  VStack,
-  Text,
-  HStack,
-  Icon,
-  Link,
-} from "@chakra-ui/react";
-import { BsStars } from "react-icons/bs";
-import { useRouter } from "next/router";
-import { useSession } from "next-auth/react";
-
-export const BetaModal = () => {
-  const router = useRouter();
-  const session = useSession();
-
-  const email = session.data?.user.email ?? "";
-
-  return (
-    <Modal
-      isOpen
-      onClose={router.back}
-      closeOnOverlayClick={false}
-      size={{ base: "xl", md: "2xl" }}
-    >
-      <ModalOverlay />
-      <ModalContent w={1200}>
-        <ModalHeader>
-          <HStack>
-            <Icon as={BsStars} />
-            <Text>Beta-Only Feature</Text>
-          </HStack>
-        </ModalHeader>
-        <ModalBody maxW="unset">
-          <VStack spacing={8} py={4} alignItems="flex-start">
-            <Text fontSize="md">
-              This feature is currently in beta. To receive early access to beta-only features, join
-              the waitlist. You'll receive an email at <b>{email}</b> when you're approved.
-            </Text>
-          </VStack>
-        </ModalBody>
-        <ModalFooter>
-          <HStack spacing={4}>
-            <Button
-              as={Link}
-              textDecoration="none !important"
-              colorScheme="orange"
-              target="_blank"
-              href={`https://ax3nafkw0jp.typeform.com/to/ZNpYqvAc#email=${email}`}
-            >
-              Join Waitlist
-            </Button>
-            <Button colorScheme="blue" onClick={router.back}>
-              Done
-            </Button>
-          </HStack>
-        </ModalFooter>
-      </ModalContent>
-    </Modal>
-  );
-};
@@ -14,7 +14,6 @@ import {
   Link as ChakraLink,
   Image,
   Box,
-  Portal,
 } from "@chakra-ui/react";
 import { useEffect } from "react";
 import Link from "next/link";
@@ -110,66 +109,64 @@ export default function ProjectMenu() {
           </HStack>
         </NavSidebarOption>
       </PopoverTrigger>
-      <Portal>
       <PopoverContent
         _focusVisible={{ outline: "unset" }}
         w={220}
         ml={{ base: 2, md: 0 }}
         boxShadow="0 0 40px 4px rgba(0, 0, 0, 0.1);"
         fontSize="sm"
       >
         <VStack alignItems="flex-start" spacing={1} py={1}>
           <Text px={3} py={2}>
             {user?.user.email}
           </Text>
           <Divider />
           <Text alignSelf="flex-start" fontWeight="bold" px={3} pt={2}>
             Your Projects
           </Text>
           <VStack spacing={0} w="full" px={1}>
             {projects?.map((proj) => (
               <ProjectOption
                 key={proj.id}
                 proj={proj}
                 isActive={proj.id === selectedProjectId}
                 onClose={popover.onClose}
               />
             ))}
             <HStack
               as={Button}
               variant="ghost"
               colorScheme="blue"
               color="blue.400"
               fontSize="sm"
               justifyContent="flex-start"
               onClick={createProject}
               w="full"
               borderRadius={4}
               spacing={0}
             >
               <Text>Add project</Text>
               <Icon as={isLoading ? Spinner : BsPlus} boxSize={4} strokeWidth={0.5} />
             </HStack>
           </VStack>
           <Divider />
           <VStack w="full" px={1}>
             <ChakraLink
               onClick={() => {
                 signOut().catch(console.error);
               }}
               _hover={{ bgColor: "gray.200", textDecoration: "none" }}
               w="full"
               py={2}
               px={2}
               borderRadius={4}
             >
               <Text>Sign out</Text>
             </ChakraLink>
           </VStack>
         </VStack>
       </PopoverContent>
-      </Portal>
     </Popover>
   </VStack>
   );
@@ -23,48 +23,50 @@ export default function UserMenu({ user, ...rest }: { user: Session } & StackPro
   );
 
   return (
+    <>
     <Popover placement="right">
       <PopoverTrigger>
         <NavSidebarOption>
           <HStack
             // Weird values to make mobile look right; can clean up when we make the sidebar disappear on mobile
             py={2}
             px={1}
             spacing={3}
             {...rest}
           >
             {profileImage}
             <VStack spacing={0} align="start" flex={1} flexShrink={1}>
               <Text fontWeight="bold" fontSize="sm">
                 {user.user.name}
               </Text>
               <Text color="gray.500" fontSize="xs">
                 {/* {user.user.email} */}
               </Text>
             </VStack>
             <Icon as={BsChevronRight} boxSize={4} color="gray.500" />
           </HStack>
         </NavSidebarOption>
       </PopoverTrigger>
       <PopoverContent _focusVisible={{ outline: "unset" }} ml={-1} minW={48} w="full">
         <VStack align="stretch" spacing={0}>
           {/* sign out */}
           <HStack
             as={Link}
             onClick={() => {
               signOut().catch(console.error);
             }}
             px={4}
             py={2}
             spacing={4}
             color="gray.500"
             fontSize="sm"
           >
             <Icon as={BsBoxArrowRight} boxSize={6} />
             <Text>Sign out</Text>
           </HStack>
         </VStack>
       </PopoverContent>
     </Popover>
+    </>
   );
 }
@@ -21,7 +21,7 @@ const ActionButton = ({
     >
       <HStack spacing={1}>
         {icon && <Icon as={icon} />}
-        <Text display={{ base: "none", md: "flex" }}>{label}</Text>
+        <Text>{label}</Text>
       </HStack>
     </Button>
   );
|
|||||||
@@ -1,117 +0,0 @@
-import {
-  Icon,
-  Popover,
-  PopoverTrigger,
-  PopoverContent,
-  VStack,
-  HStack,
-  Button,
-  Text,
-  useDisclosure,
-  Box,
-} from "@chakra-ui/react";
-import { BiCheck } from "react-icons/bi";
-import { BsToggles } from "react-icons/bs";
-import { useMemo } from "react";
-
-import { useIsClientRehydrated, useTagNames } from "~/utils/hooks";
-import { useAppStore } from "~/state/store";
-import { StaticColumnKeys } from "~/state/columnVisiblitySlice";
-import ActionButton from "./ActionButton";
-
-const ColumnVisiblityDropdown = () => {
-  const tagNames = useTagNames().data;
-
-  const visibleColumns = useAppStore((s) => s.columnVisibility.visibleColumns);
-  const toggleColumnVisibility = useAppStore((s) => s.columnVisibility.toggleColumnVisibility);
-  const totalColumns = Object.keys(StaticColumnKeys).length + (tagNames?.length ?? 0);
-
-  const popover = useDisclosure();
-
-  const columnVisiblityOptions = useMemo(() => {
-    const options: { label: string; key: string }[] = [
-      {
-        label: "Sent At",
-        key: StaticColumnKeys.SENT_AT,
-      },
-      {
-        label: "Model",
-        key: StaticColumnKeys.MODEL,
-      },
-      {
-        label: "Duration",
-        key: StaticColumnKeys.DURATION,
-      },
-      {
-        label: "Input Tokens",
-        key: StaticColumnKeys.INPUT_TOKENS,
-      },
-      {
-        label: "Output Tokens",
-        key: StaticColumnKeys.OUTPUT_TOKENS,
-      },
-      {
-        label: "Status Code",
-        key: StaticColumnKeys.STATUS_CODE,
-      },
-    ];
-    for (const tagName of tagNames ?? []) {
-      options.push({
-        label: tagName,
-        key: tagName,
-      });
-    }
-    return options;
-  }, [tagNames]);
-
-  const isClientRehydrated = useIsClientRehydrated();
-  if (!isClientRehydrated) return null;
-
-  return (
-    <Popover
-      placement="bottom-start"
-      isOpen={popover.isOpen}
-      onOpen={popover.onOpen}
-      onClose={popover.onClose}
-    >
-      <PopoverTrigger>
-        <Box>
-          <ActionButton
-            label={`Columns (${visibleColumns.size}/${totalColumns})`}
-            icon={BsToggles}
-          />
-        </Box>
-      </PopoverTrigger>
-      <PopoverContent boxShadow="0 0 40px 4px rgba(0, 0, 0, 0.1);" minW={0} w="auto">
-        <VStack spacing={0} maxH={400} overflowY="auto">
-          {columnVisiblityOptions?.map((option, index) => (
-            <HStack
-              key={index}
-              as={Button}
-              onClick={() => toggleColumnVisibility(option.key)}
-              w="full"
-              minH={10}
-              variant="ghost"
-              justifyContent="space-between"
-              fontWeight="semibold"
-              borderRadius={0}
-              colorScheme="blue"
-              color="black"
-              fontSize="sm"
-              borderBottomWidth={1}
-            >
-              <Text mr={16}>{option.label}</Text>
-              <Box w={5}>
-                {visibleColumns.has(option.key) && (
-                  <Icon as={BiCheck} color="blue.500" boxSize={5} />
-                )}
-              </Box>
-            </HStack>
-          ))}
-        </VStack>
-      </PopoverContent>
-    </Popover>
-  );
-};
-
-export default ColumnVisiblityDropdown;
@@ -1,210 +0,0 @@
-import { useState, useEffect } from "react";
-import {
-  Modal,
-  ModalOverlay,
-  ModalContent,
-  ModalHeader,
-  ModalCloseButton,
-  ModalBody,
-  ModalFooter,
-  HStack,
-  VStack,
-  Icon,
-  Text,
-  Button,
-  Checkbox,
-  NumberInput,
-  NumberInputField,
-  NumberInputStepper,
-  NumberIncrementStepper,
-  NumberDecrementStepper,
-  Collapse,
-  Flex,
-  useDisclosure,
-  type UseDisclosureReturn,
-} from "@chakra-ui/react";
-import { BiExport } from "react-icons/bi";
-
-import { useHandledAsyncCallback } from "~/utils/hooks";
-import { api } from "~/utils/api";
-import { useAppStore } from "~/state/store";
-import ActionButton from "./ActionButton";
-import InputDropdown from "../InputDropdown";
-import { FiChevronUp, FiChevronDown } from "react-icons/fi";
-import InfoCircle from "../InfoCircle";
-
-const SUPPORTED_EXPORT_FORMATS = ["alpaca-finetune", "openai-fine-tune", "unformatted"];
-
-const ExportButton = () => {
-  const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
-
-  const disclosure = useDisclosure();
-
-  return (
-    <>
-      <ActionButton
-        onClick={disclosure.onOpen}
-        label="Export"
-        icon={BiExport}
-        isDisabled={selectedLogIds.size === 0}
-      />
-      <ExportLogsModal disclosure={disclosure} />
-    </>
-  );
-};
-
-export default ExportButton;
-
-const ExportLogsModal = ({ disclosure }: { disclosure: UseDisclosureReturn }) => {
-  const selectedProjectId = useAppStore((s) => s.selectedProjectId);
-  const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
-  const clearSelectedLogIds = useAppStore((s) => s.selectedLogs.clearSelectedLogIds);
-
-  const [selectedExportFormat, setSelectedExportFormat] = useState(SUPPORTED_EXPORT_FORMATS[0]);
-  const [testingSplit, setTestingSplit] = useState(10);
-  const [removeDuplicates, setRemoveDuplicates] = useState(true);
-  const [showAdvancedOptions, setShowAdvancedOptions] = useState(false);
-
-  useEffect(() => {
-    if (disclosure.isOpen) {
-      setSelectedExportFormat(SUPPORTED_EXPORT_FORMATS[0]);
-      setTestingSplit(10);
-      setRemoveDuplicates(true);
-    }
-  }, [disclosure.isOpen]);
-
-  const exportLogsMutation = api.loggedCalls.export.useMutation();
-
-  const [exportLogs, exportInProgress] = useHandledAsyncCallback(async () => {
-    if (!selectedProjectId || !selectedLogIds.size || !testingSplit || !selectedExportFormat)
-      return;
-    const response = await exportLogsMutation.mutateAsync({
-      projectId: selectedProjectId,
-      selectedLogIds: Array.from(selectedLogIds),
-      testingSplit,
-      selectedExportFormat,
-      removeDuplicates,
-    });
-
-    const dataUrl = `data:application/pdf;base64,${response}`;
-    const blob = await fetch(dataUrl).then((res) => res.blob());
-    const url = URL.createObjectURL(blob);
-    const a = document.createElement("a");
-
-    a.href = url;
-    a.download = `data.zip`;
-    document.body.appendChild(a);
-    a.click();
-    document.body.removeChild(a);
-
-    disclosure.onClose();
-    clearSelectedLogIds();
-  }, [
-    exportLogsMutation,
-    selectedProjectId,
-    selectedLogIds,
-    testingSplit,
-    selectedExportFormat,
-    removeDuplicates,
-  ]);
-
-  return (
-    <Modal size={{ base: "xl", md: "2xl" }} {...disclosure}>
-      <ModalOverlay />
-      <ModalContent w={1200}>
-        <ModalHeader>
-          <HStack>
-            <Icon as={BiExport} />
-            <Text>Export Logs</Text>
-          </HStack>
-        </ModalHeader>
-        <ModalCloseButton />
-        <ModalBody maxW="unset">
-          <VStack w="full" spacing={8} pt={4} alignItems="flex-start">
-            <Text>
-              We'll export the <b>{selectedLogIds.size}</b> logs you have selected in the format of
-              your choice.
-            </Text>
-            <VStack alignItems="flex-start" spacing={4}>
-              <Flex
-                flexDir={{ base: "column", md: "row" }}
-                alignItems={{ base: "flex-start", md: "center" }}
-              >
-                <HStack w={48} alignItems="center" spacing={1}>
-                  <Text fontWeight="bold">Format:</Text>
-                  <InfoCircle tooltipText="Format logs for for fine tuning or export them without formatting." />
-                </HStack>
-                <InputDropdown
-                  options={SUPPORTED_EXPORT_FORMATS}
-                  selectedOption={selectedExportFormat}
-                  onSelect={(option) => setSelectedExportFormat(option)}
-                  inputGroupProps={{ w: 48 }}
-                />
-              </Flex>
-              <Flex
-                flexDir={{ base: "column", md: "row" }}
-                alignItems={{ base: "flex-start", md: "center" }}
-              >
-                <HStack w={48} alignItems="center" spacing={1}>
-                  <Text fontWeight="bold">Testing Split:</Text>
-                  <InfoCircle tooltipText="The percent of your logs that will be reserved for testing and saved in another file. Logs are split randomly." />
-                </HStack>
-                <HStack>
-                  <NumberInput
-                    defaultValue={10}
-                    onChange={(_, num) => setTestingSplit(num)}
-                    min={1}
-                    max={100}
-                    w={48}
-                  >
-                    <NumberInputField />
-                    <NumberInputStepper>
-                      <NumberIncrementStepper />
-                      <NumberDecrementStepper />
-                    </NumberInputStepper>
-                  </NumberInput>
-                </HStack>
-              </Flex>
-            </VStack>
-            <VStack alignItems="flex-start" spacing={0}>
-              <Button
-                variant="unstyled"
-                color="blue.600"
-                onClick={() => setShowAdvancedOptions(!showAdvancedOptions)}
-              >
-                <HStack>
-                  <Text>Advanced Options</Text>
-                  <Icon as={showAdvancedOptions ? FiChevronUp : FiChevronDown} />
-                </HStack>
-              </Button>
-              <Collapse in={showAdvancedOptions} unmountOnExit={true}>
-                <VStack align="stretch" pt={4}>
-                  <HStack>
-                    <Checkbox
-                      colorScheme="blue"
-                      isChecked={removeDuplicates}
-                      onChange={(e) => setRemoveDuplicates(e.target.checked)}
-                    >
-                      <Text>Remove duplicates</Text>
-                    </Checkbox>
-                    <InfoCircle tooltipText="To avoid overfitting and speed up training, automatically deduplicate logs with matching input and output." />
-                  </HStack>
-                </VStack>
-              </Collapse>
-            </VStack>
-          </VStack>
-        </ModalBody>
-        <ModalFooter>
-          <HStack>
-            <Button colorScheme="gray" onClick={disclosure.onClose} minW={24}>
-              Cancel
-            </Button>
-            <Button colorScheme="blue" onClick={exportLogs} isLoading={exportInProgress} minW={24}>
-              Export
-            </Button>
-          </HStack>
-        </ModalFooter>
-      </ModalContent>
-    </Modal>
-  );
-};
@@ -1,161 +0,0 @@
-import { useState, useEffect } from "react";
-import {
-  Modal,
-  ModalOverlay,
-  ModalContent,
-  ModalHeader,
-  ModalCloseButton,
-  ModalBody,
-  ModalFooter,
-  HStack,
-  VStack,
-  Icon,
-  Text,
-  Button,
-  useDisclosure,
-  type UseDisclosureReturn,
-  Input,
-} from "@chakra-ui/react";
-import { FaRobot } from "react-icons/fa";
-import humanId from "human-id";
-import { useRouter } from "next/router";
-
-import { useHandledAsyncCallback } from "~/utils/hooks";
-import { api } from "~/utils/api";
-import { useAppStore } from "~/state/store";
-import ActionButton from "./ActionButton";
-import InputDropdown from "../InputDropdown";
-import { FiChevronDown } from "react-icons/fi";
-
-const SUPPORTED_BASE_MODELS = ["llama2-7b", "llama2-13b", "llama2-70b", "gpt-3.5-turbo"];
-
-const FineTuneButton = () => {
-  const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
-
-  const disclosure = useDisclosure();
-
-  return (
-    <>
-      <ActionButton
-        onClick={disclosure.onOpen}
-        label="Fine Tune"
-        icon={FaRobot}
-        isDisabled={selectedLogIds.size === 0}
-      />
-      <FineTuneModal disclosure={disclosure} />
-    </>
-  );
-};
-
-export default FineTuneButton;
-
-const FineTuneModal = ({ disclosure }: { disclosure: UseDisclosureReturn }) => {
-  const selectedProjectId = useAppStore((s) => s.selectedProjectId);
-  const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
-  const clearSelectedLogIds = useAppStore((s) => s.selectedLogs.clearSelectedLogIds);
-
-  const [selectedBaseModel, setSelectedBaseModel] = useState(SUPPORTED_BASE_MODELS[0]);
-  const [modelSlug, setModelSlug] = useState(humanId({ separator: "-", capitalize: false }));
-
-  useEffect(() => {
-    if (disclosure.isOpen) {
-      setSelectedBaseModel(SUPPORTED_BASE_MODELS[0]);
-      setModelSlug(humanId({ separator: "-", capitalize: false }));
-    }
-  }, [disclosure.isOpen]);
-
-  const utils = api.useContext();
-  const router = useRouter();
-
-  const createFineTuneMutation = api.fineTunes.create.useMutation();
-
-  const [createFineTune, creationInProgress] = useHandledAsyncCallback(async () => {
-    if (!selectedProjectId || !modelSlug || !selectedBaseModel || !selectedLogIds.size) return;
-    await createFineTuneMutation.mutateAsync({
-      projectId: selectedProjectId,
-      slug: modelSlug,
-      baseModel: selectedBaseModel,
-      selectedLogIds: Array.from(selectedLogIds),
-    });
-
-    await utils.fineTunes.list.invalidate();
-    await router.push({ pathname: "/fine-tunes" });
-    clearSelectedLogIds();
-    disclosure.onClose();
-  }, [createFineTuneMutation, selectedProjectId, selectedLogIds, modelSlug, selectedBaseModel]);
-
-  return (
-    <Modal size={{ base: "xl", md: "2xl" }} {...disclosure}>
-      <ModalOverlay />
-      <ModalContent w={1200}>
-        <ModalHeader>
-          <HStack>
-            <Icon as={FaRobot} />
-            <Text>Fine Tune</Text>
-          </HStack>
-        </ModalHeader>
-        <ModalCloseButton />
-        <ModalBody maxW="unset">
-          <VStack w="full" spacing={8} pt={4} alignItems="flex-start">
-            <Text>
-              We'll train on the <b>{selectedLogIds.size}</b> logs you've selected.
-            </Text>
-            <VStack>
-              <HStack spacing={2} w="full">
-                <Text fontWeight="bold" w={36}>
-                  Model ID:
-                </Text>
-                <Input
-                  value={modelSlug}
-                  onChange={(e) => setModelSlug(e.target.value)}
-                  w={48}
-                  placeholder="unique-id"
-                  onKeyDown={(e) => {
-                    // If the user types anything other than a-z, A-Z, or 0-9, replace it with -
-                    if (!/[a-zA-Z0-9]/.test(e.key)) {
-                      e.preventDefault();
-                      setModelSlug((s) => s && `${s}-`);
-                    }
-                  }}
-                />
-              </HStack>
-              <HStack spacing={2}>
-                <Text fontWeight="bold" w={36}>
-                  Base model:
-                </Text>
-                <InputDropdown
-                  options={SUPPORTED_BASE_MODELS}
-                  selectedOption={selectedBaseModel}
-                  onSelect={(option) => setSelectedBaseModel(option)}
-                  inputGroupProps={{ w: 48 }}
-                />
-              </HStack>
-            </VStack>
-            <Button variant="unstyled" color="blue.600">
-              <HStack>
-                <Text>Advanced Options</Text>
-                <Icon as={FiChevronDown} />
-              </HStack>
-            </Button>
-          </VStack>
-        </ModalBody>
-        <ModalFooter>
-          <HStack>
-            <Button colorScheme="gray" onClick={disclosure.onClose} minW={24}>
-              Cancel
-            </Button>
-            <Button
-              colorScheme="blue"
-              onClick={createFineTune}
-              isLoading={creationInProgress}
-              minW={24}
-              isDisabled={!modelSlug}
-            >
-              Start Training
-            </Button>
-          </HStack>
-        </ModalFooter>
-      </ModalContent>
-    </Modal>
-  );
-};
@@ -10,7 +10,7 @@ export default function LoggedCallsTable() {
   return (
     <Card width="100%" overflowX="auto">
       <Table>
-        <TableHeader showOptions />
+        <TableHeader showCheckbox />
         <Tbody>
           {loggedCalls?.calls?.map((loggedCall) => {
             return (
@@ -25,7 +25,7 @@ export default function LoggedCallsTable() {
                   setExpandedRow(loggedCall.id);
                 }
               }}
-              showOptions
+              showCheckbox
            />
          );
        })}
@@ -14,19 +14,21 @@ import {
   Text,
   Checkbox,
 } from "@chakra-ui/react";
+import dayjs from "dayjs";
+import relativeTime from "dayjs/plugin/relativeTime";
 import Link from "next/link";

-import dayjs from "~/utils/dayjs";
 import { type RouterOutputs } from "~/utils/api";
 import { FormattedJson } from "./FormattedJson";
 import { useAppStore } from "~/state/store";
-import { useIsClientRehydrated, useLoggedCalls, useTagNames } from "~/utils/hooks";
+import { useLoggedCalls, useTagNames } from "~/utils/hooks";
 import { useMemo } from "react";
-import { StaticColumnKeys } from "~/state/columnVisiblitySlice";
+
+dayjs.extend(relativeTime);

 type LoggedCall = RouterOutputs["loggedCalls"]["list"]["calls"][0];

-export const TableHeader = ({ showOptions }: { showOptions?: boolean }) => {
+export const TableHeader = ({ showCheckbox }: { showCheckbox?: boolean }) => {
   const matchingLogIds = useLoggedCalls().data?.matchingLogIds;
   const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
   const addAll = useAppStore((s) => s.selectedLogs.addSelectedLogIds);
@@ -36,14 +38,10 @@ export const TableHeader = ({ showOptions }: { showOptions?: boolean }) => {
     return matchingLogIds.every((id) => selectedLogIds.has(id));
   }, [selectedLogIds, matchingLogIds]);
   const tagNames = useTagNames().data;
-  const visibleColumns = useAppStore((s) => s.columnVisibility.visibleColumns);
-
-  const isClientRehydrated = useIsClientRehydrated();
-  if (!isClientRehydrated) return null;

   return (
     <Thead>
       <Tr>
-        {showOptions && (
+        {showCheckbox && (
           <Th pr={0}>
             <HStack minW={16}>
               <Checkbox
@@ -59,19 +57,13 @@ export const TableHeader = ({ showOptions }: { showOptions?: boolean }) => {
             </HStack>
           </Th>
         )}
-        {visibleColumns.has(StaticColumnKeys.SENT_AT) && <Th>Sent At</Th>}
-        {visibleColumns.has(StaticColumnKeys.MODEL) && <Th>Model</Th>}
-        {tagNames
-          ?.filter((tagName) => visibleColumns.has(tagName))
-          .map((tagName) => (
-            <Th key={tagName} textTransform={"none"}>
-              {tagName}
-            </Th>
-          ))}
-        {visibleColumns.has(StaticColumnKeys.DURATION) && <Th isNumeric>Duration</Th>}
-        {visibleColumns.has(StaticColumnKeys.INPUT_TOKENS) && <Th isNumeric>Input tokens</Th>}
-        {visibleColumns.has(StaticColumnKeys.OUTPUT_TOKENS) && <Th isNumeric>Output tokens</Th>}
-        {visibleColumns.has(StaticColumnKeys.STATUS_CODE) && <Th isNumeric>Status</Th>}
+        <Th>Sent At</Th>
+        <Th>Model</Th>
+        {tagNames?.map((tagName) => <Th key={tagName}>{tagName}</Th>)}
+        <Th isNumeric>Duration</Th>
+        <Th isNumeric>Input tokens</Th>
+        <Th isNumeric>Output tokens</Th>
+        <Th isNumeric>Status</Th>
       </Tr>
     </Thead>
   );
@@ -81,12 +73,12 @@ export const TableRow = ({
   loggedCall,
   isExpanded,
   onToggle,
-  showOptions,
+  showCheckbox,
 }: {
   loggedCall: LoggedCall;
   isExpanded: boolean;
   onToggle: () => void;
-  showOptions?: boolean;
+  showCheckbox?: boolean;
 }) => {
   const isError = loggedCall.modelResponse?.statusCode !== 200;
   const requestedAt = dayjs(loggedCall.requestedAt).format("MMMM D h:mm A");
@@ -96,14 +88,6 @@ export const TableRow = ({
   const toggleChecked = useAppStore((s) => s.selectedLogs.toggleSelectedLogId);

   const tagNames = useTagNames().data;
-  const visibleColumns = useAppStore((s) => s.columnVisibility.visibleColumns);
-
-  const visibleTagNames = useMemo(() => {
-    return tagNames?.filter((tagName) => visibleColumns.has(tagName)) ?? [];
-  }, [tagNames, visibleColumns]);
-
-  const isClientRehydrated = useIsClientRehydrated();
-  if (!isClientRehydrated) return null;

   return (
     <>
@@ -116,64 +100,50 @@ export const TableRow = ({
         }}
         fontSize="sm"
       >
-        {showOptions && (
+        {showCheckbox && (
           <Td>
             <Checkbox isChecked={isChecked} onChange={() => toggleChecked(loggedCall.id)} />
           </Td>
         )}
-        {visibleColumns.has(StaticColumnKeys.SENT_AT) && (
-          <Td>
-            <Tooltip label={fullTime} placement="top">
-              <Box whiteSpace="nowrap" minW="120px">
-                {requestedAt}
-              </Box>
-            </Tooltip>
-          </Td>
-        )}
-        {visibleColumns.has(StaticColumnKeys.MODEL) && (
-          <Td>
-            <HStack justifyContent="flex-start">
-              <Text
-                colorScheme="purple"
-                color="purple.500"
-                borderColor="purple.500"
-                px={1}
-                borderRadius={4}
-                borderWidth={1}
-                fontSize="xs"
-                whiteSpace="nowrap"
-              >
-                {loggedCall.model}
-              </Text>
-            </HStack>
-          </Td>
-        )}
-        {visibleTagNames.map((tagName) => (
-          <Td key={tagName}>{loggedCall.tags[tagName]}</Td>
-        ))}
-        {visibleColumns.has(StaticColumnKeys.DURATION) && (
-          <Td isNumeric>
-            {loggedCall.cacheHit ? (
-              <Text color="gray.500">Cached</Text>
-            ) : (
-              ((loggedCall.modelResponse?.durationMs ?? 0) / 1000).toFixed(2) + "s"
-            )}
-          </Td>
-        )}
-        {visibleColumns.has(StaticColumnKeys.INPUT_TOKENS) && (
-          <Td isNumeric>{loggedCall.modelResponse?.inputTokens}</Td>
-        )}
-        {visibleColumns.has(StaticColumnKeys.OUTPUT_TOKENS) && (
-          <Td isNumeric>{loggedCall.modelResponse?.outputTokens}</Td>
-        )}
-        {visibleColumns.has(StaticColumnKeys.STATUS_CODE) && (
-          <Td sx={{ color: isError ? "red.500" : "green.500", fontWeight: "semibold" }} isNumeric>
-            {loggedCall.modelResponse?.statusCode ?? "No response"}
-          </Td>
-        )}
+        <Td>
+          <Tooltip label={fullTime} placement="top">
+            <Box whiteSpace="nowrap" minW="120px">
+              {requestedAt}
+            </Box>
+          </Tooltip>
+        </Td>
+        <Td>
+          <HStack justifyContent="flex-start">
+            <Text
+              colorScheme="purple"
+              color="purple.500"
+              borderColor="purple.500"
+              px={1}
+              borderRadius={4}
+              borderWidth={1}
+              fontSize="xs"
+              whiteSpace="nowrap"
+            >
+              {loggedCall.model}
+            </Text>
+          </HStack>
+        </Td>
+        {tagNames?.map((tagName) => <Td key={tagName}>{loggedCall.tags[tagName]}</Td>)}
+        <Td isNumeric>
+          {loggedCall.cacheHit ? (
+            <Text color="gray.500">Cached</Text>
+          ) : (
+            ((loggedCall.modelResponse?.durationMs ?? 0) / 1000).toFixed(2) + "s"
+          )}
+        </Td>
+        <Td isNumeric>{loggedCall.modelResponse?.inputTokens}</Td>
+        <Td isNumeric>{loggedCall.modelResponse?.outputTokens}</Td>
+        <Td sx={{ color: isError ? "red.500" : "green.500", fontWeight: "semibold" }} isNumeric>
+          {loggedCall.modelResponse?.statusCode ?? "No response"}
+        </Td>
       </Tr>
       <Tr>
-        <Td colSpan={visibleColumns.size + 1} w="full" p={0}>
+        <Td colSpan={8} p={0}>
           <Collapse in={isExpanded} unmountOnExit={true}>
             <VStack p={4} align="stretch">
               <HStack align="stretch">
@@ -26,14 +26,6 @@ export const env = createEnv({
     SMTP_PORT: z.string().default("placeholder"),
     SMTP_LOGIN: z.string().default("placeholder"),
     SMTP_PASSWORD: z.string().default("placeholder"),
-    WORKER_CONCURRENCY: z
-      .string()
-      .default("10")
-      .transform((val) => parseInt(val)),
-    WORKER_MAX_POOL_SIZE: z
-      .string()
-      .default("10")
-      .transform((val) => parseInt(val)),
   },

   /**
@@ -46,6 +38,8 @@ export const env = createEnv({
     NEXT_PUBLIC_SOCKET_URL: z.string().url().default("http://localhost:3318"),
     NEXT_PUBLIC_HOST: z.string().url().default("http://localhost:3000"),
     NEXT_PUBLIC_SENTRY_DSN: z.string().optional(),
+    NEXT_PUBLIC_SHOW_DATA: z.string().optional(),
+    NEXT_PUBLIC_FF_SHOW_LOGGED_CALLS: z.string().optional(),
   },

   /**
@@ -60,6 +54,7 @@ export const env = createEnv({
     NEXT_PUBLIC_POSTHOG_KEY: process.env.NEXT_PUBLIC_POSTHOG_KEY,
     NEXT_PUBLIC_SOCKET_URL: process.env.NEXT_PUBLIC_SOCKET_URL,
     NEXT_PUBLIC_HOST: process.env.NEXT_PUBLIC_HOST,
+    NEXT_PUBLIC_SHOW_DATA: process.env.NEXT_PUBLIC_SHOW_DATA,
     GITHUB_CLIENT_ID: process.env.GITHUB_CLIENT_ID,
     GITHUB_CLIENT_SECRET: process.env.GITHUB_CLIENT_SECRET,
     REPLICATE_API_TOKEN: process.env.REPLICATE_API_TOKEN,
@@ -67,13 +62,12 @@ export const env = createEnv({
     NEXT_PUBLIC_SENTRY_DSN: process.env.NEXT_PUBLIC_SENTRY_DSN,
     SENTRY_AUTH_TOKEN: process.env.SENTRY_AUTH_TOKEN,
     OPENPIPE_API_KEY: process.env.OPENPIPE_API_KEY,
+    NEXT_PUBLIC_FF_SHOW_LOGGED_CALLS: process.env.NEXT_PUBLIC_FF_SHOW_LOGGED_CALLS,
     SENDER_EMAIL: process.env.SENDER_EMAIL,
     SMTP_HOST: process.env.SMTP_HOST,
     SMTP_PORT: process.env.SMTP_PORT,
     SMTP_LOGIN: process.env.SMTP_LOGIN,
     SMTP_PASSWORD: process.env.SMTP_PASSWORD,
-    WORKER_CONCURRENCY: process.env.WORKER_CONCURRENCY,
-    WORKER_MAX_POOL_SIZE: process.env.WORKER_MAX_POOL_SIZE,
   },
   /**
    * Run `build` or `dev` with `SKIP_ENV_VALIDATION` to skip env validation.
@@ -16,16 +16,7 @@ export async function getCompletion(
   try {
     if (onStream) {
       const resp = await openai.chat.completions.create(
-        {
-          ...input,
-          stream: true,
-          openpipe: {
-            tags: {
-              prompt_id: "getCompletion",
-              stream: "true",
-            },
-          },
-        },
+        { ...input, stream: true },
         {
           maxRetries: 0,
         },
@@ -43,16 +34,7 @@ export async function getCompletion(
       }
     } else {
       const resp = await openai.chat.completions.create(
-        {
-          ...input,
-          stream: false,
-          openpipe: {
-            tags: {
-              prompt_id: "getCompletion",
-              stream: "false",
-            },
-          },
-        },
+        { ...input, stream: false },
         {
          maxRetries: 0,
        },
@@ -33,7 +33,7 @@ export default function Dashboard() {
   );

   return (
-    <AppShell title="Dashboard" requireAuth requireBeta>
+    <AppShell title="Dashboard" requireAuth>
      <VStack px={8} py={8} alignItems="flex-start" spacing={4}>
        <Text fontSize="2xl" fontWeight="bold">
          Dashboard
97  app/src/pages/data/[id].tsx  Normal file
@@ -0,0 +1,97 @@
+import {
+  Box,
+  Breadcrumb,
+  BreadcrumbItem,
+  Center,
+  Flex,
+  Icon,
+  Input,
+  VStack,
+} from "@chakra-ui/react";
+import Link from "next/link";
+
+import { useRouter } from "next/router";
+import { useState, useEffect } from "react";
+import { RiDatabase2Line } from "react-icons/ri";
+import AppShell from "~/components/nav/AppShell";
+import { api } from "~/utils/api";
+import { useDataset, useHandledAsyncCallback } from "~/utils/hooks";
+import DatasetEntriesTable from "~/components/datasets/DatasetEntriesTable";
+import { DatasetHeaderButtons } from "~/components/datasets/DatasetHeaderButtons/DatasetHeaderButtons";
+import PageHeaderContainer from "~/components/nav/PageHeaderContainer";
+import ProjectBreadcrumbContents from "~/components/nav/ProjectBreadcrumbContents";
+
+export default function Dataset() {
+  const router = useRouter();
+  const utils = api.useContext();
+
+  const dataset = useDataset();
+  const datasetId = router.query.id as string;
+
+  const [name, setName] = useState(dataset.data?.name || "");
+  useEffect(() => {
+    setName(dataset.data?.name || "");
+  }, [dataset.data?.name]);
+
+  const updateMutation = api.datasets.update.useMutation();
+  const [onSaveName] = useHandledAsyncCallback(async () => {
+    if (name && name !== dataset.data?.name && dataset.data?.id) {
+      await updateMutation.mutateAsync({
+        id: dataset.data.id,
+        updates: { name: name },
+      });
+      await Promise.all([utils.datasets.list.invalidate(), utils.datasets.get.invalidate()]);
+    }
+  }, [updateMutation, dataset.data?.id, dataset.data?.name, name]);
+
+  if (!dataset.isLoading && !dataset.data) {
+    return (
+      <AppShell title="Dataset not found">
+        <Center h="100%">
+          <div>Dataset not found 😕</div>
+        </Center>
+      </AppShell>
+    );
+  }
+
+  return (
+    <AppShell title={dataset.data?.name}>
+      <VStack h="full">
+        <PageHeaderContainer>
+          <Breadcrumb>
+            <BreadcrumbItem>
+              <ProjectBreadcrumbContents projectName={dataset.data?.project?.name} />
+            </BreadcrumbItem>
+            <BreadcrumbItem>
+              <Link href="/data">
+                <Flex alignItems="center" _hover={{ textDecoration: "underline" }}>
+                  <Icon as={RiDatabase2Line} boxSize={4} mr={2} /> Datasets
+                </Flex>
+              </Link>
+            </BreadcrumbItem>
+            <BreadcrumbItem isCurrentPage>
+              <Input
+                size="sm"
+                value={name}
+                onChange={(e) => setName(e.target.value)}
+                onBlur={onSaveName}
+                borderWidth={1}
+                borderColor="transparent"
+                fontSize={16}
+                px={0}
+                minW={{ base: 100, lg: 300 }}
+                flex={1}
+                _hover={{ borderColor: "gray.300" }}
+                _focus={{ borderColor: "blue.500", outline: "none" }}
+              />
+            </BreadcrumbItem>
+          </Breadcrumb>
+          <DatasetHeaderButtons />
+        </PageHeaderContainer>
+        <Box w="full" overflowX="auto" flex={1} px={8} pt={8} pb={16}>
+          {datasetId && <DatasetEntriesTable />}
+        </Box>
+      </VStack>
+    </AppShell>
+  );
+}
49  app/src/pages/data/index.tsx  Normal file
@@ -0,0 +1,49 @@
+import { SimpleGrid, Icon, Breadcrumb, BreadcrumbItem, Flex } from "@chakra-ui/react";
+import AppShell from "~/components/nav/AppShell";
+import { RiDatabase2Line } from "react-icons/ri";
+import {
+  DatasetCard,
+  DatasetCardSkeleton,
+  NewDatasetCard,
+} from "~/components/datasets/DatasetCard";
+import PageHeaderContainer from "~/components/nav/PageHeaderContainer";
+import ProjectBreadcrumbContents from "~/components/nav/ProjectBreadcrumbContents";
+import { useDatasets } from "~/utils/hooks";
+
+export default function DatasetsPage() {
+  const datasets = useDatasets();
+
+  return (
+    <AppShell title="Data" requireAuth>
+      <PageHeaderContainer>
+        <Breadcrumb>
+          <BreadcrumbItem>
+            <ProjectBreadcrumbContents />
+          </BreadcrumbItem>
+          <BreadcrumbItem minH={8}>
+            <Flex alignItems="center">
+              <Icon as={RiDatabase2Line} boxSize={4} mr={2} /> Datasets
+            </Flex>
+          </BreadcrumbItem>
+        </Breadcrumb>
+      </PageHeaderContainer>
+      <SimpleGrid w="full" columns={{ base: 1, md: 2, lg: 3, xl: 4 }} spacing={8} py={4} px={8}>
+        <NewDatasetCard />
+        {datasets.data && !datasets.isLoading ? (
+          datasets?.data?.map((dataset) => (
+            <DatasetCard
+              key={dataset.id}
+              dataset={{ ...dataset, numEntries: dataset._count.datasetEntries }}
+            />
+          ))
+        ) : (
+          <>
+            <DatasetCardSkeleton />
+            <DatasetCardSkeleton />
+            <DatasetCardSkeleton />
+          </>
+        )}
+      </SimpleGrid>
+    </AppShell>
+  );
+}
@@ -1,18 +0,0 @@
-import { Text, VStack, Divider } from "@chakra-ui/react";
-import FineTunesTable from "~/components/fineTunes/FineTunesTable";
-
-import AppShell from "~/components/nav/AppShell";
-
-export default function FineTunes() {
-  return (
-    <AppShell title="Fine Tunes" requireAuth requireBeta>
-      <VStack px={8} py={8} alignItems="flex-start" spacing={4} w="full">
-        <Text fontSize="2xl" fontWeight="bold">
-          Fine Tunes
-        </Text>
-        <Divider />
-        <FineTunesTable />
-      </VStack>
-    </AppShell>
-  );
-}
@@ -1,5 +1,5 @@
 import { useState } from "react";
-import { Text, VStack, Divider, HStack, Box } from "@chakra-ui/react";
+import { Text, VStack, Divider, HStack } from "@chakra-ui/react";

 import AppShell from "~/components/nav/AppShell";
 import LoggedCallTable from "~/components/requestLogs/LoggedCallsTable";
@@ -9,9 +9,6 @@ import { useAppStore } from "~/state/store";
 import { RiFlaskLine } from "react-icons/ri";
 import { FiFilter } from "react-icons/fi";
 import LogFilters from "~/components/requestLogs/LogFilters/LogFilters";
-import ColumnVisiblityDropdown from "~/components/requestLogs/ColumnVisiblityDropdown";
-import FineTuneButton from "~/components/requestLogs/FineTuneButton";
-import ExportButton from "~/components/requestLogs/ExportButton";

 export default function LoggedCalls() {
   const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
@@ -19,38 +16,33 @@ export default function LoggedCalls() {
   const [filtersShown, setFiltersShown] = useState(true);

   return (
-    <AppShell title="Request Logs" requireAuth requireBeta>
-      <Box h="100vh" overflowY="scroll">
-        <VStack px={8} py={8} alignItems="flex-start" spacing={4} w="full">
-          <Text fontSize="2xl" fontWeight="bold">
-            Request Logs
-          </Text>
-          <Divider />
-          <HStack w="full" justifyContent="flex-end">
-            <FineTuneButton />
-            <ActionButton
-              onClick={() => {
-                console.log("experimenting with these ids", selectedLogIds);
-              }}
-              label="Experiment"
-              icon={RiFlaskLine}
-              isDisabled={selectedLogIds.size === 0}
-            />
-            <ExportButton />
-            <ColumnVisiblityDropdown />
-            <ActionButton
-              onClick={() => {
-                setFiltersShown(!filtersShown);
-              }}
-              label={filtersShown ? "Hide Filters" : "Show Filters"}
-              icon={FiFilter}
-            />
-          </HStack>
-          {filtersShown && <LogFilters />}
-          <LoggedCallTable />
-          <LoggedCallsPaginator />
-        </VStack>
-      </Box>
+    <AppShell title="Request Logs" requireAuth>
+      <VStack px={8} py={8} alignItems="flex-start" spacing={4} w="full">
+        <Text fontSize="2xl" fontWeight="bold">
+          Request Logs
+        </Text>
+        <Divider />
+        <HStack w="full" justifyContent="flex-end">
+          <ActionButton
+            onClick={() => {
+              setFiltersShown(!filtersShown);
+            }}
+            label={filtersShown ? "Hide Filters" : "Show Filters"}
+            icon={FiFilter}
+          />
+          <ActionButton
+            onClick={() => {
+              console.log("experimenting with these ids", selectedLogIds);
+            }}
+            label="Experiment"
+            icon={RiFlaskLine}
+            isDisabled={selectedLogIds.size === 0}
+          />
+        </HStack>
+        {filtersShown && <LogFilters />}
+        <LoggedCallTable />
+        <LoggedCallsPaginator />
+      </VStack>
     </AppShell>
   );
 }
108  app/src/server/api/autogenerate/autogenerateDatasetEntries.ts  Normal file
@@ -0,0 +1,108 @@
+import { type ChatCompletion } from "openai/resources/chat";
+import { openai } from "../../utils/openai";
+import { isAxiosError } from "./utils";
+import { type APIResponse } from "openai/core";
+import { sleep } from "~/server/utils/sleep";
+
+const MAX_AUTO_RETRIES = 50;
+const MIN_DELAY = 500; // milliseconds
+const MAX_DELAY = 15000; // milliseconds
+
+function calculateDelay(numPreviousTries: number): number {
+  const baseDelay = Math.min(MAX_DELAY, MIN_DELAY * Math.pow(2, numPreviousTries));
+  const jitter = Math.random() * baseDelay;
+  return baseDelay + jitter;
+}
+
+const getCompletionWithBackoff = async (
+  getCompletion: () => Promise<APIResponse<ChatCompletion>>,
+) => {
+  let completion;
+  let tries = 0;
+  while (tries < MAX_AUTO_RETRIES) {
+    try {
+      completion = await getCompletion();
+      break;
+    } catch (e) {
+      if (isAxiosError(e)) {
+        console.error(e?.response?.data?.error?.message);
+      } else {
+        await sleep(calculateDelay(tries));
+        console.error(e);
+      }
+    }
+    tries++;
+  }
+  return completion;
+};
+// TODO: Add seeds to ensure batches don't contain duplicate data
+const MAX_BATCH_SIZE = 5;
+
+export const autogenerateDatasetEntries = async (
+  numToGenerate: number,
+  inputDescription: string,
+  outputDescription: string,
+): Promise<{ input: string; output: string }[]> => {
+  const batchSizes = Array.from({ length: Math.ceil(numToGenerate / MAX_BATCH_SIZE) }, (_, i) =>
+    i === Math.ceil(numToGenerate / MAX_BATCH_SIZE) - 1 && numToGenerate % MAX_BATCH_SIZE
+      ? numToGenerate % MAX_BATCH_SIZE
+      : MAX_BATCH_SIZE,
+  );
+
+  const getCompletion = (batchSize: number) =>
+    openai.chat.completions.create({
+      model: "gpt-4",
+      messages: [
+        {
+          role: "system",
+          content: `The user needs ${batchSize} rows of data, each with an input and an output.\n---\n The input should follow these requirements: ${inputDescription}\n---\n The output should follow these requirements: ${outputDescription}`,
+        },
+      ],
+      functions: [
+        {
+          name: "add_list_of_data",
+          description: "Add a list of data to the database",
+          parameters: {
+            type: "object",
+            properties: {
+              rows: {
+                type: "array",
+                description: "The rows of data that match the description",
+                items: {
+                  type: "object",
+                  properties: {
+                    input: {
+                      type: "string",
+                      description: "The input for this row",
+                    },
+                    output: {
+                      type: "string",
+                      description: "The output for this row",
+                    },
+                  },
+                },
+              },
+            },
+          },
+        },
+      ],
+
+      function_call: { name: "add_list_of_data" },
+      temperature: 0.5,
+    });
+
+  const completionCallbacks = batchSizes.map((batchSize) =>
+    getCompletionWithBackoff(() => getCompletion(batchSize)),
+  );
+
+  const completions = await Promise.all(completionCallbacks);
+
+  const rows = completions.flatMap((completion) => {
+    const parsed = JSON.parse(
+      completion?.choices[0]?.message?.function_call?.arguments ?? "{rows: []}",
+    ) as { rows: { input: string; output: string }[] };
+    return parsed.rows;
+  });
+
+  return rows;
+};
@@ -98,11 +98,6 @@ export const autogenerateScenarioValues = async (

     function_call: { name: "add_scenario" },
     temperature: 0.5,
-    openpipe: {
-      tags: {
-        prompt_id: "autogenerateScenarioValues",
-      },
-    },
   });

   const parsed = JSON.parse(
13  app/src/server/api/external/v1Api.router.ts  vendored
@@ -66,7 +66,7 @@ export const v1ApiRouter = createOpenApiRouter({

       if (!existingResponse) return { respPayload: null };

-      const newCall = await prisma.loggedCall.create({
+      await prisma.loggedCall.create({
         data: {
           projectId: ctx.key.projectId,
           requestedAt: new Date(input.requestedAt),
@@ -75,7 +75,11 @@
         },
       });

-      await createTags(newCall.projectId, newCall.id, input.tags);
+      await createTags(
+        existingResponse.originalLoggedCall.projectId,
+        existingResponse.originalLoggedCallId,
+        input.tags,
+      );
       return {
         respPayload: existingResponse.respPayload,
       };
@@ -107,7 +111,7 @@
           .default({}),
       }),
     )
-    .output(z.object({ status: z.union([z.literal("ok"), z.literal("error")]) }))
+    .output(z.object({ status: z.literal("ok") }))
     .mutation(async ({ input, ctx }) => {
       const reqPayload = await reqValidator.spa(input.reqPayload);
       const respPayload = await respValidator.spa(input.respPayload);
@@ -208,7 +212,6 @@
         createdAt: true,
         cacheHit: true,
         tags: true,
-        id: true,
         modelResponse: {
           select: {
             id: true,
@@ -234,7 +237,7 @@ async function createTags(projectId: string, loggedCallId: string, tags: Record<
   const tagsToCreate = Object.entries(tags).map(([name, value]) => ({
     projectId,
     loggedCallId,
-    name: name.replaceAll(/[^a-zA-Z0-9_$.]/g, "_"),
+    name: name.replaceAll(/[^a-zA-Z0-9_$]/g, "_"),
     value,
   }));
   await prisma.loggedCallTag.createMany({
@@ -6,10 +6,11 @@ import { scenarioVariantCellsRouter } from "./routers/scenarioVariantCells.route
 import { scenarioVarsRouter } from "./routers/scenarioVariables.router";
 import { evaluationsRouter } from "./routers/evaluations.router";
 import { worldChampsRouter } from "./routers/worldChamps.router";
+import { datasetsRouter } from "./routers/datasets.router";
+import { datasetEntries } from "./routers/datasetEntries.router";
 import { projectsRouter } from "./routers/projects.router";
 import { dashboardRouter } from "./routers/dashboard.router";
 import { loggedCallsRouter } from "./routers/loggedCalls.router";
-import { fineTunesRouter } from "./routers/fineTunes.router";
 import { usersRouter } from "./routers/users.router";
 import { adminJobsRouter } from "./routers/adminJobs.router";

@@ -26,10 +27,11 @@ export const appRouter = createTRPCRouter({
   scenarioVars: scenarioVarsRouter,
   evaluations: evaluationsRouter,
   worldChamps: worldChampsRouter,
+  datasets: datasetsRouter,
+  datasetEntries: datasetEntries,
   projects: projectsRouter,
   dashboard: dashboardRouter,
   loggedCalls: loggedCallsRouter,
-  fineTunes: fineTunesRouter,
   users: usersRouter,
   adminJobs: adminJobsRouter,
 });
145  app/src/server/api/routers/datasetEntries.router.ts  Normal file
@@ -0,0 +1,145 @@
+import { z } from "zod";
+import { createTRPCRouter, protectedProcedure } from "~/server/api/trpc";
+import { prisma } from "~/server/db";
+import { requireCanModifyDataset, requireCanViewDataset } from "~/utils/accessControl";
+import { autogenerateDatasetEntries } from "../autogenerate/autogenerateDatasetEntries";
+
+export const datasetEntries = createTRPCRouter({
+  list: protectedProcedure
+    .input(z.object({ datasetId: z.string(), page: z.number(), pageSize: z.number() }))
+    .query(async ({ input, ctx }) => {
+      await requireCanViewDataset(input.datasetId, ctx);
+
+      const { datasetId, page, pageSize } = input;
+
+      const entries = await prisma.datasetEntry.findMany({
+        where: {
+          datasetId,
+        },
+        orderBy: { createdAt: "desc" },
+        skip: (page - 1) * pageSize,
+        take: pageSize,
+      });
+
+      const count = await prisma.datasetEntry.count({
+        where: {
+          datasetId,
+        },
+      });
+
+      return {
+        entries,
+        count,
+      };
+    }),
+  createOne: protectedProcedure
+    .input(
+      z.object({
+        datasetId: z.string(),
+        input: z.string(),
+        output: z.string().optional(),
+      }),
+    )
+    .mutation(async ({ input, ctx }) => {
+      await requireCanModifyDataset(input.datasetId, ctx);
+
+      return await prisma.datasetEntry.create({
+        data: {
+          datasetId: input.datasetId,
+          input: input.input,
+          output: input.output,
+        },
+      });
+    }),
+
+  autogenerateEntries: protectedProcedure
+    .input(
+      z.object({
+        datasetId: z.string(),
+        numToGenerate: z.number(),
+        inputDescription: z.string(),
+        outputDescription: z.string(),
+      }),
+    )
+    .mutation(async ({ input, ctx }) => {
+      await requireCanModifyDataset(input.datasetId, ctx);
+
+      const dataset = await prisma.dataset.findUnique({
+        where: {
+          id: input.datasetId,
+        },
+      });
+
+      if (!dataset) {
+        throw new Error(`Dataset with id ${input.datasetId} does not exist`);
+      }
+
+      const entries = await autogenerateDatasetEntries(
+        input.numToGenerate,
+        input.inputDescription,
+        input.outputDescription,
+      );
+
+      const createdEntries = await prisma.datasetEntry.createMany({
+        data: entries.map((entry) => ({
+          datasetId: input.datasetId,
+          input: entry.input,
+          output: entry.output,
+        })),
+      });
+
+      return createdEntries;
+    }),
+
+  delete: protectedProcedure
+    .input(z.object({ id: z.string() }))
+    .mutation(async ({ input, ctx }) => {
+      const datasetId = (
+        await prisma.datasetEntry.findUniqueOrThrow({
+          where: { id: input.id },
+        })
+      ).datasetId;
+
+      await requireCanModifyDataset(datasetId, ctx);
+
+      return await prisma.datasetEntry.delete({
+        where: {
+          id: input.id,
+        },
+      });
+    }),
+
+  update: protectedProcedure
+    .input(
+      z.object({
+        id: z.string(),
+        updates: z.object({
+          input: z.string(),
+          output: z.string().optional(),
+        }),
+      }),
+    )
+    .mutation(async ({ input, ctx }) => {
+      const existing = await prisma.datasetEntry.findUnique({
+        where: {
+          id: input.id,
+        },
+      });
+
+      if (!existing) {
+        throw new Error(`dataEntry with id ${input.id} does not exist`);
+      }
+
+      await requireCanModifyDataset(existing.datasetId, ctx);
+
+      return await prisma.datasetEntry.update({
+        where: {
+          id: input.id,
+        },
+        data: {
+          input: input.updates.input,
+          output: input.updates.output,
+        },
+      });
+    }),
+});
88  app/src/server/api/routers/datasets.router.ts  Normal file

@@ -0,0 +1,88 @@
import { z } from "zod";
import { createTRPCRouter, protectedProcedure, publicProcedure } from "~/server/api/trpc";
import { prisma } from "~/server/db";
import {
  requireCanModifyDataset,
  requireCanModifyProject,
  requireCanViewDataset,
  requireCanViewProject,
} from "~/utils/accessControl";

export const datasetsRouter = createTRPCRouter({
  list: protectedProcedure
    .input(z.object({ projectId: z.string() }))
    .query(async ({ input, ctx }) => {
      await requireCanViewProject(input.projectId, ctx);

      const datasets = await prisma.dataset.findMany({
        where: {
          projectId: input.projectId,
        },
        orderBy: {
          createdAt: "desc",
        },
        include: {
          _count: {
            select: { datasetEntries: true },
          },
        },
      });

      return datasets;
    }),

  get: publicProcedure.input(z.object({ id: z.string() })).query(async ({ input, ctx }) => {
    await requireCanViewDataset(input.id, ctx);
    return await prisma.dataset.findFirstOrThrow({
      where: { id: input.id },
      include: {
        project: true,
      },
    });
  }),

  create: protectedProcedure
    .input(z.object({ projectId: z.string() }))
    .mutation(async ({ input, ctx }) => {
      await requireCanModifyProject(input.projectId, ctx);

      const numDatasets = await prisma.dataset.count({
        where: {
          projectId: input.projectId,
        },
      });

      return await prisma.dataset.create({
        data: {
          name: `Dataset ${numDatasets + 1}`,
          projectId: input.projectId,
        },
      });
    }),

  update: protectedProcedure
    .input(z.object({ id: z.string(), updates: z.object({ name: z.string() }) }))
    .mutation(async ({ input, ctx }) => {
      await requireCanModifyDataset(input.id, ctx);
      return await prisma.dataset.update({
        where: {
          id: input.id,
        },
        data: {
          name: input.updates.name,
        },
      });
    }),

  delete: protectedProcedure
    .input(z.object({ id: z.string() }))
    .mutation(async ({ input, ctx }) => {
      await requireCanModifyDataset(input.id, ctx);

      await prisma.dataset.delete({
        where: {
          id: input.id,
        },
      });
    }),
});
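As a quick aside (not part of this diff), the new datasets router can be exercised through a server-side tRPC caller. This is a minimal sketch; the `~/server/api/root` import path and the context type are assumptions based on the usual create-t3-app layout, not code from this PR:

```ts
// Illustrative sketch only — paths and context shape are assumptions.
import { appRouter } from "~/server/api/root";
import { type createTRPCContext } from "~/server/api/trpc";

type Ctx = Awaited<ReturnType<typeof createTRPCContext>>;

export async function renameFirstDataset(ctx: Ctx, projectId: string) {
  const caller = appRouter.createCaller(ctx);

  // list runs requireCanViewProject before returning datasets with their entry counts
  const datasets = await caller.datasets.list({ projectId });
  const first = datasets[0];
  if (!first) return;

  // update only accepts a name change, matching the router's zod schema above
  await caller.datasets.update({ id: first.id, updates: { name: "Starter dataset" } });
}
```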
@@ -178,7 +178,6 @@ export const experimentsRouter = createTRPCRouter({
       existingToNewVariantIds.set(variant.id, newVariantId);
       variantsToCreate.push({
         ...variant,
-        uiId: uuidv4(),
         id: newVariantId,
         experimentId: newExperimentId,
       });

@@ -192,7 +191,6 @@ export const experimentsRouter = createTRPCRouter({
       scenariosToCreate.push({
         ...scenario,
         id: newScenarioId,
-        uiId: uuidv4(),
         experimentId: newExperimentId,
         variableValues: scenario.variableValues as Prisma.InputJsonValue,
       });
@@ -1,113 +0,0 @@ (file removed)
import { z } from "zod";
import { v4 as uuidv4 } from "uuid";
import { type Prisma } from "@prisma/client";

import { createTRPCRouter, protectedProcedure } from "~/server/api/trpc";
import { prisma } from "~/server/db";
import { requireCanViewProject, requireCanModifyProject } from "~/utils/accessControl";
import { error, success } from "~/utils/errorHandling/standardResponses";

export const fineTunesRouter = createTRPCRouter({
  list: protectedProcedure
    .input(
      z.object({
        projectId: z.string(),
        page: z.number(),
        pageSize: z.number(),
      }),
    )
    .query(async ({ input, ctx }) => {
      const { projectId, page, pageSize } = input;

      await requireCanViewProject(projectId, ctx);

      const fineTunes = await prisma.fineTune.findMany({
        where: {
          projectId,
        },
        include: {
          dataset: {
            include: {
              _count: {
                select: {
                  datasetEntries: true,
                },
              },
            },
          },
        },
        orderBy: { createdAt: "asc" },
        skip: (page - 1) * pageSize,
        take: pageSize,
      });

      const count = await prisma.fineTune.count({
        where: {
          projectId,
        },
      });

      return {
        fineTunes,
        count,
      };
    }),
  create: protectedProcedure
    .input(
      z.object({
        projectId: z.string(),
        selectedLogIds: z.array(z.string()),
        slug: z.string(),
        baseModel: z.string(),
      }),
    )
    .mutation(async ({ input, ctx }) => {
      await requireCanModifyProject(input.projectId, ctx);

      const existingFineTune = await prisma.fineTune.findFirst({
        where: {
          slug: input.slug,
        },
      });

      if (existingFineTune) {
        return error("A fine tune with that slug already exists");
      }

      const newDatasetId = uuidv4();

      const datasetEntriesToCreate: Prisma.DatasetEntryCreateManyDatasetInput[] =
        input.selectedLogIds.map((loggedCallId) => ({
          loggedCallId,
        }));

      await prisma.$transaction([
        prisma.dataset.create({
          data: {
            id: newDatasetId,
            name: input.slug,
            project: {
              connect: {
                id: input.projectId,
              },
            },
            datasetEntries: {
              createMany: {
                data: datasetEntriesToCreate,
              },
            },
          },
        }),
        prisma.fineTune.create({
          data: {
            projectId: input.projectId,
            slug: input.slug,
            baseModel: input.baseModel,
            datasetId: newDatasetId,
          },
        }),
      ]);

      return success();
    }),
});
@@ -1,16 +1,11 @@
 import { z } from "zod";
 import { type Expression, type SqlBool, sql, type RawBuilder } from "kysely";
 import { jsonArrayFrom } from "kysely/helpers/postgres";
-import archiver from "archiver";
-import { WritableStreamBuffer } from "stream-buffers";
-import { type JsonValue } from "type-fest";
-import { shuffle } from "lodash-es";

 import { createTRPCRouter, protectedProcedure } from "~/server/api/trpc";
 import { kysely, prisma } from "~/server/db";
 import { comparators, defaultFilterableFields } from "~/state/logFiltersSlice";
 import { requireCanViewProject } from "~/utils/accessControl";
-import hashObject from "~/server/utils/hashObject";

 // create comparator type based off of comparators
 const comparatorToSqlExpression = (comparator: (typeof comparators)[number], value: string) => {

@@ -185,102 +180,4 @@ export const loggedCallsRouter = createTRPCRouter({

     return tags.map((tag) => tag.name);
   }),
-  export: protectedProcedure
-    .input(
-      z.object({
-        projectId: z.string(),
-        selectedLogIds: z.string().array(),
-        testingSplit: z.number(),
-        selectedExportFormat: z.string(),
-        removeDuplicates: z.boolean(),
-      }),
-    )
-    .mutation(async ({ input, ctx }) => {
-      await requireCanViewProject(input.projectId, ctx);
-
-      // Fetch the real data using Prisma
-      const loggedCallsFromDb = await ctx.prisma.loggedCallModelResponse.findMany({
-        where: {
-          originalLoggedCall: {
-            projectId: input.projectId,
-            id: { in: input.selectedLogIds },
-          },
-          statusCode: 200,
-        },
-      });
-
-      // Convert the database data into the desired format
-      let formattedLoggedCalls: { instruction: JsonValue[]; output: JsonValue }[] =
-        loggedCallsFromDb.map((call) => ({
-          instruction: (call.reqPayload as unknown as Record<string, unknown>)
-            .messages as JsonValue[],
-          output: (call.respPayload as unknown as { choices: { message: unknown }[] }).choices[0]
-            ?.message as JsonValue,
-        }));
-
-      if (input.removeDuplicates) {
-        const deduplicatedLoggedCalls = [];
-        const loggedCallHashSet = new Set<string>();
-        for (const loggedCall of formattedLoggedCalls) {
-          const loggedCallHash = hashObject(loggedCall);
-          if (!loggedCallHashSet.has(loggedCallHash)) {
-            loggedCallHashSet.add(loggedCallHash);
-            deduplicatedLoggedCalls.push(loggedCall);
-          }
-        }
-        formattedLoggedCalls = deduplicatedLoggedCalls;
-      }
-
-      // Remove duplicate messages from instructions
-      const instructionMessageHashMap = new Map<string, number>();
-      for (const loggedCall of formattedLoggedCalls) {
-        for (const message of loggedCall.instruction) {
-          const hash = hashObject(message);
-          if (instructionMessageHashMap.has(hash)) {
-            // eslint-disable-next-line @typescript-eslint/no-non-null-assertion
-            instructionMessageHashMap.set(hash, instructionMessageHashMap.get(hash)! + 1);
-          } else {
-            instructionMessageHashMap.set(hash, 0);
-          }
-        }
-      }
-      for (const loggedCall of formattedLoggedCalls) {
-        loggedCall.instruction = loggedCall.instruction.filter((message) => {
-          const hash = hashObject(message);
-          // If the same message appears in a single instruction multiple times, there is some danger of
-          // it being removed from all logged calls. This is enough of an edge case that we don't
-          // need to worry about it for now.
-          // eslint-disable-next-line @typescript-eslint/no-non-null-assertion
-          return instructionMessageHashMap.get(hash)! < formattedLoggedCalls.length;
-        });
-      }
-
-      // Stringify instructions and outputs
-      const stringifiedLoggedCalls = shuffle(formattedLoggedCalls).map((loggedCall) => ({
-        instruction: JSON.stringify(loggedCall.instruction),
-        output: JSON.stringify(loggedCall.output),
-      }));
-
-      const splitIndex = Math.floor((stringifiedLoggedCalls.length * input.testingSplit) / 100);
-
-      const testingData = stringifiedLoggedCalls.slice(0, splitIndex);
-      const trainingData = stringifiedLoggedCalls.slice(splitIndex);
-
-      // Convert arrays to JSONL format
-      const trainingDataJSONL = trainingData.map((item) => JSON.stringify(item)).join("\n");
-      const testingDataJSONL = testingData.map((item) => JSON.stringify(item)).join("\n");
-
-      const output = new WritableStreamBuffer();
-      const archive = archiver("zip");
-
-      archive.pipe(output);
-      archive.append(trainingDataJSONL, { name: "train.jsonl" });
-      archive.append(testingDataJSONL, { name: "test.jsonl" });
-      await archive.finalize();
-
-      // Convert buffer to base64
-      const base64 = output.getContents().toString("base64");
-
-      return base64;
-    }),
 });
@@ -61,7 +61,7 @@ export const scenarioVariantCellsRouter = createTRPCRouter({
       evalsComplete,
     };
   }),
-  hardRefetch: protectedProcedure
+  forceRefetch: protectedProcedure
    .input(
      z.object({
        scenarioId: z.string(),

@@ -85,10 +85,7 @@ export const scenarioVariantCellsRouter = createTRPCRouter({
      });

      if (!cell) {
-        await generateNewCell(input.variantId, input.scenarioId, {
-          stream: true,
-          hardRefetch: true,
-        });
+        await generateNewCell(input.variantId, input.scenarioId, { stream: true });
        return;
      }

@@ -99,7 +96,7 @@ export const scenarioVariantCellsRouter = createTRPCRouter({
        },
      });

-      await queueQueryModel(cell.id, { stream: true, hardRefetch: true });
+      await queueQueryModel(cell.id, true);
    }),
  getTemplatedPromptMessage: publicProcedure
    .input(
19  app/src/server/scripts/openai-test.ts  Normal file

@@ -0,0 +1,19 @@
import "dotenv/config";
import { openai } from "../utils/openai";

const resp = await openai.chat.completions.create({
  model: "gpt-3.5-turbo-0613",
  stream: true,
  messages: [
    {
      role: "user",
      content: "count to 20",
    },
  ],
});

for await (const part of resp) {
  console.log("part", part);
}

console.log("final resp", resp);
@@ -1,4 +1,4 @@
-import { type Helpers, type Task, makeWorkerUtils, TaskSpec } from "graphile-worker";
+import { type Helpers, type Task, makeWorkerUtils } from "graphile-worker";
 import { env } from "~/env.mjs";

 let workerUtilsPromise: ReturnType<typeof makeWorkerUtils> | null = null;

@@ -16,11 +16,9 @@ function defineTask<TPayload>(
   taskIdentifier: string,
   taskHandler: (payload: TPayload, helpers: Helpers) => Promise<void>,
 ) {
-  const enqueue = async (payload: TPayload, spec?: TaskSpec) => {
+  const enqueue = async (payload: TPayload, runAt?: Date) => {
     console.log("Enqueuing task", taskIdentifier, payload);
-    const utils = await workerUtils();
-    return await utils.addJob(taskIdentifier, payload, spec);
+    await (await workerUtils()).addJob(taskIdentifier, payload, { runAt });
   };

   const handler = (payload: TPayload, helpers: Helpers) => {
@@ -25,6 +25,7 @@ function calculateDelay(numPreviousTries: number): number {
 }

 export const queryModel = defineTask<QueryModelJob>("queryModel", async (task) => {
+  console.log("RUNNING TASK", task);
   const { cellId, stream, numPreviousTries } = task;
   const cell = await prisma.scenarioVariantCell.findUnique({
     where: { id: cellId },

@@ -152,7 +153,7 @@ export const queryModel = defineTask<QueryModelJob>("queryModel", async (task) =
         stream,
         numPreviousTries: numPreviousTries + 1,
       },
-      { runAt: retryTime, jobKey: cellId, priority: 3 },
+      retryTime,
     );
     await prisma.scenarioVariantCell.update({
       where: { id: cellId },

@@ -171,13 +172,7 @@ export const queryModel = defineTask<QueryModelJob>("queryModel", async (task) =
     }
   });

-export const queueQueryModel = async (
-  cellId: string,
-  options: { stream?: boolean; hardRefetch?: boolean } = {},
-) => {
-  // Hard refetches are higher priority than streamed queries, which are higher priority than non-streamed queries.
-  const jobPriority = options.hardRefetch ? 0 : options.stream ? 1 : 2;
-
+export const queueQueryModel = async (cellId: string, stream: boolean) => {
   await Promise.all([
     prisma.scenarioVariantCell.update({
       where: {

@@ -189,13 +184,6 @@ export const queueQueryModel = async (
         jobQueuedAt: new Date(),
       },
     }),
-    queryModel.enqueue(
-      { cellId, stream: options.stream ?? false, numPreviousTries: 0 },
-
-      // Streamed queries are higher priority than non-streamed queries. Lower
-      // numbers are higher priority in graphile-worker.
-      { jobKey: cellId, priority: jobPriority },
-    ),
+    queryModel.enqueue({ cellId, stream, numPreviousTries: 0 }),
   ]);
 };
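For context (not part of the diff): the removed options map directly onto graphile-worker's standard TaskSpec fields. A minimal sketch of what those fields do, assuming a `DATABASE_URL` environment variable:

```ts
// Illustrative sketch only — not code from this PR; the env var name is an assumption.
import { makeWorkerUtils } from "graphile-worker";

async function enqueueQueryModel(cellId: string) {
  const utils = await makeWorkerUtils({ connectionString: process.env.DATABASE_URL });

  // jobKey deduplicates: re-adding a job with the same key replaces the pending one.
  // Lower priority numbers run first, so a hard refetch (0) would beat a streamed
  // query (1), a non-streamed query (2), retries (3), and evals (4).
  await utils.addJob(
    "queryModel",
    { cellId, stream: true, numPreviousTries: 0 },
    { jobKey: cellId, priority: 1, runAt: new Date() },
  );

  await utils.release();
}
```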
@@ -13,6 +13,5 @@ export const runNewEval = defineTask<RunNewEvalJob>("runNewEval", async (task) =
   });

 export const queueRunNewEval = async (experimentId: string) => {
-  // Evals are lower priority than completions
-  await runNewEval.enqueue({ experimentId }, { priority: 4 });
+  await runNewEval.enqueue({ experimentId });
 };
@@ -1,47 +0,0 @@ (file removed)
import "dotenv/config";

import defineTask from "./defineTask";
import { type TaskList, run } from "graphile-worker";
import { env } from "~/env.mjs";

import "../../../sentry.server.config";

export type TestTask = { i: number };

// When a new eval is created, we want to run it on all existing outputs, but return the new eval first
export const testTask = defineTask<TestTask>("testTask", (task) => {
  console.log("ran task ", task.i);

  void new Promise((_resolve, reject) => setTimeout(reject, 500));
  return Promise.resolve();
});

const registeredTasks = [testTask];

const taskList = registeredTasks.reduce((acc, task) => {
  acc[task.task.identifier] = task.task.handler;
  return acc;
}, {} as TaskList);

// process.on("unhandledRejection", (reason, promise) => {
//   console.log("Unhandled Rejection at:", reason?.stack || reason);
// });

// Run a worker to execute jobs:
const runner = await run({
  connectionString: env.DATABASE_URL,
  concurrency: 10,
  // Install signal handlers for graceful shutdown on SIGINT, SIGTERM, etc
  noHandleSignals: false,
  pollInterval: 1000,
  taskList,
});

console.log("Worker successfully started");

for (let i = 0; i < 10; i++) {
  await testTask.enqueue({ i });
  await new Promise((resolve) => setTimeout(resolve, 1000));
}

await runner.promise;
@@ -1,6 +1,5 @@
 import { type TaskList, run } from "graphile-worker";
 import "dotenv/config";
-import "../../../sentry.server.config";

 import { env } from "~/env.mjs";
 import { queryModel } from "./queryModel.task";

@@ -18,8 +17,7 @@ const taskList = registeredTasks.reduce((acc, task) => {
 // Run a worker to execute jobs:
 const runner = await run({
   connectionString: env.DATABASE_URL,
-  concurrency: env.WORKER_CONCURRENCY,
-  maxPoolSize: env.WORKER_MAX_POOL_SIZE,
+  concurrency: 10,
   // Install signal handlers for graceful shutdown on SIGINT, SIGTERM, etc
   noHandleSignals: false,
   pollInterval: 1000,
@@ -41,7 +41,7 @@ const requestUpdatedPromptFunction = async (
 ) => {
   const originalModelProvider = modelProviders[originalVariant.modelProvider as SupportedProvider];
   const originalModel = originalModelProvider.models[originalVariant.model] as Model;
-  let newConstructionFn = "";
+  let newContructionFn = "";
   for (let i = 0; i < NUM_RETRIES; i++) {
     try {
       const messages: CreateChatCompletionRequestMessage[] = [

@@ -109,12 +109,6 @@ const requestUpdatedPromptFunction = async (
         function_call: {
           name: "update_prompt_constructor_function",
         },
-        openpipe: {
-          tags: {
-            prompt_id: "deriveNewConstructFn",
-            model_translation: (!!newModel).toString(),
-          },
-        },
       });
       const argString = completion.choices[0]?.message?.function_call?.arguments || "{}";

@@ -137,7 +131,7 @@ const requestUpdatedPromptFunction = async (
       const args = await contructPromptFunctionArgs.copy(); // Get the actual value from the isolate

       if (args && isObject(args) && "new_prompt_function" in args) {
-        newConstructionFn = await formatPromptConstructor(args.new_prompt_function as string);
+        newContructionFn = await formatPromptConstructor(args.new_prompt_function as string);
         break;
       }
     } catch (e) {

@@ -145,5 +139,5 @@ const requestUpdatedPromptFunction = async (
     }
   }

-  return newConstructionFn;
+  return newContructionFn;
 };
@@ -9,8 +9,10 @@ import parsePromptConstructor from "~/promptConstructor/parse";
 export const generateNewCell = async (
   variantId: string,
   scenarioId: string,
-  options: { stream?: boolean; hardRefetch?: boolean } = {},
+  options?: { stream?: boolean },
 ): Promise<void> => {
+  const stream = options?.stream ?? false;
+
   const variant = await prisma.promptVariant.findUnique({
     where: {
       id: variantId,

@@ -119,6 +121,6 @@ export const generateNewCell = async (
       }),
     );
   } else {
-    await queueQueryModel(cell.id, options);
+    await queueQueryModel(cell.id, stream);
   }
 };
@@ -17,7 +17,13 @@ try {
   // Set a dummy key so it doesn't fail at build time
   config = {
     apiKey: env.OPENAI_API_KEY ?? "dummy-key",
+    openpipe: {
+      apiKey: env.OPENPIPE_API_KEY,
+      baseUrl: "http://localhost:3000/api/v1",
+    },
   };
 }

+// export const openai = env.OPENPIPE_API_KEY ? new OpenAI.OpenAI(config) : new OriginalOpenAI(config);
+
 export const openai = new OpenAI(config);
@@ -53,11 +53,6 @@ export const runGpt4Eval = async (
       },
     },
   ],
-  openpipe: {
-    tags: {
-      prompt_id: "runOneEval",
-    },
-  },
 });

 try {
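For context (not part of the diff): both removed blocks passed OpenPipe's per-request `openpipe.tags` option so related completions can be grouped later. A hedged sketch of that usage with the wrapped client from the openai util above; the exact import path is an assumption:

```ts
// Illustrative sketch only — not code from this PR. Relies on the OpenPipe-wrapped
// client, which accepts the extra `openpipe` field just as the removed calls did.
import { openai } from "~/server/utils/openai";

export async function taggedCompletion() {
  return await openai.chat.completions.create({
    model: "gpt-3.5-turbo-0613",
    messages: [{ role: "user", content: "count to 10" }],
    // Tags group related completions, e.g. to pull them into a fine-tuning dataset later.
    openpipe: {
      tags: { prompt_id: "example-tagged-call" },
    },
  });
}
```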
@@ -1,37 +0,0 @@ (file removed)
import { type SliceCreator } from "./store";

export const comparators = ["=", "!=", "CONTAINS", "NOT_CONTAINS"] as const;

export const defaultFilterableFields = ["Request", "Response", "Model", "Status Code"] as const;

export enum StaticColumnKeys {
  SENT_AT = "sentAt",
  MODEL = "model",
  DURATION = "duration",
  INPUT_TOKENS = "inputTokens",
  OUTPUT_TOKENS = "outputTokens",
  STATUS_CODE = "statusCode",
}

export type ColumnVisibilitySlice = {
  visibleColumns: Set<string>;
  toggleColumnVisibility: (columnKey: string) => void;
  showAllColumns: (columnKeys: string[]) => void;
};

export const createColumnVisibilitySlice: SliceCreator<ColumnVisibilitySlice> = (set, get) => ({
  // initialize with all static columns visible
  visibleColumns: new Set(Object.values(StaticColumnKeys)),
  toggleColumnVisibility: (columnKey: string) =>
    set((state) => {
      if (state.columnVisibility.visibleColumns.has(columnKey)) {
        state.columnVisibility.visibleColumns.delete(columnKey);
      } else {
        state.columnVisibility.visibleColumns.add(columnKey);
      }
    }),
  showAllColumns: (columnKeys: string[]) =>
    set((state) => {
      state.columnVisibility.visibleColumns = new Set(columnKeys);
    }),
});
@@ -1,23 +0,0 @@ (file removed)
import { type SliceCreator } from "./store";

export type FeatureFlagsSlice = {
  flagsLoaded: boolean;
  featureFlags: {
    betaAccess: boolean;
  };
  setFeatureFlags: (flags: string[] | undefined) => void;
};

export const createFeatureFlagsSlice: SliceCreator<FeatureFlagsSlice> = (set) => ({
  flagsLoaded: false,
  featureFlags: {
    betaAccess: false,
  },
  setFeatureFlags: (flags) =>
    set((state) => {
      state.featureFlags.featureFlags = {
        betaAccess: flags?.includes("betaAccess") ?? false,
      };
      state.featureFlags.flagsLoaded = true;
    }),
});
@@ -1,27 +1,13 @@
 import { type PersistOptions } from "zustand/middleware/persist";
 import { type State } from "./store";
-import SuperJSON from "superjson";
-import { merge, pick } from "lodash-es";
-import { type PartialDeep } from "type-fest";

-export type PersistedState = PartialDeep<State>;
+export const stateToPersist = {
+  selectedProjectId: null as string | null,
+};

-export const persistOptions: PersistOptions<State, PersistedState> = {
+export const persistOptions: PersistOptions<State, typeof stateToPersist> = {
   name: "persisted-app-store",
   partialize: (state) => ({
     selectedProjectId: state.selectedProjectId,
-    columnVisibility: pick(state.columnVisibility, ["visibleColumns"]),
   }),
-  merge: (saved, state) => merge(state, saved),
-  storage: {
-    getItem: (key) => {
-      const data = localStorage.getItem(key);
-      return data ? SuperJSON.parse(data) : null;
-    },
-    setItem: (key, value) => localStorage.setItem(key, SuperJSON.stringify(value)),
-    removeItem: (key) => localStorage.removeItem(key),
-  },
-  onRehydrateStorage: (state) => {
-    if (state) state.isRehydrated = true;
-  },
 };
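As an aside (not part of the diff): the removed `storage` block existed because the persisted slice held a `Set`, which plain JSON storage silently drops. A minimal sketch of the difference:

```ts
// Illustrative sketch only — not code from this PR.
import SuperJSON from "superjson";

const visibleColumns = new Set(["sentAt", "model"]);

// Plain JSON round-trips a Set as an empty object, losing its contents.
const lossy = JSON.parse(JSON.stringify({ visibleColumns }));
console.log(lossy.visibleColumns instanceof Set); // false

// SuperJSON preserves the Set, which is what the removed getItem/setItem
// handlers provided on top of localStorage.
const restored = SuperJSON.parse<{ visibleColumns: Set<string> }>(
  SuperJSON.stringify({ visibleColumns }),
);
console.log(restored.visibleColumns instanceof Set); // true
```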
@@ -8,16 +8,13 @@ import {
   createVariantEditorSlice,
 } from "./sharedVariantEditor.slice";
 import { type APIClient } from "~/utils/api";
-import { type PersistedState, persistOptions } from "./persist";
+import { persistOptions, type stateToPersist } from "./persist";
 import { type SelectedLogsSlice, createSelectedLogsSlice } from "./selectedLogsSlice";
 import { type LogFiltersSlice, createLogFiltersSlice } from "./logFiltersSlice";
-import { type ColumnVisibilitySlice, createColumnVisibilitySlice } from "./columnVisiblitySlice";
-import { type FeatureFlagsSlice, createFeatureFlagsSlice } from "./featureFlags";

 enableMapSet();

 export type State = {
-  isRehydrated: boolean;
   drawerOpen: boolean;
   openDrawer: () => void;
   closeDrawer: () => void;

@@ -28,8 +25,6 @@ export type State = {
   setSelectedProjectId: (id: string) => void;
   selectedLogs: SelectedLogsSlice;
   logFilters: LogFiltersSlice;
-  columnVisibility: ColumnVisibilitySlice;
-  featureFlags: FeatureFlagsSlice;
 };

 export type SliceCreator<T> = StateCreator<State, [["zustand/immer", never]], [], T>;

@@ -37,15 +32,18 @@ export type SliceCreator<T> = StateCreator<State, [["zustand/immer", never]], []
 export type SetFn = Parameters<SliceCreator<unknown>>[0];
 export type GetFn = Parameters<SliceCreator<unknown>>[1];

-const useBaseStore = create<State, [["zustand/persist", PersistedState], ["zustand/immer", never]]>(
+const useBaseStore = create<
+  State,
+  [["zustand/persist", typeof stateToPersist], ["zustand/immer", never]]
+>(
   persist(
     immer((set, get, ...rest) => ({
-      isRehydrated: false,
       api: null,
       setApi: (api) =>
         set((state) => {
           state.api = api;
         }),

       drawerOpen: false,
       openDrawer: () =>
         set((state) => {

@@ -63,8 +61,6 @@ const useBaseStore = create<State, [["zustand/persist", PersistedState], ["zusta
       }),
       selectedLogs: createSelectedLogsSlice(set, get, ...rest),
       logFilters: createLogFiltersSlice(set, get, ...rest),
-      columnVisibility: createColumnVisibilitySlice(set, get, ...rest),
-      featureFlags: createFeatureFlagsSlice(set, get, ...rest),
     })),
     persistOptions,
   ),
@@ -78,6 +78,33 @@ export const requireCanModifyProject = async (projectId: string, ctx: TRPCContex
   }
 };

+export const requireCanViewDataset = async (datasetId: string, ctx: TRPCContext) => {
+  ctx.markAccessControlRun();
+
+  const dataset = await prisma.dataset.findFirst({
+    where: {
+      id: datasetId,
+      project: {
+        projectUsers: {
+          some: {
+            role: { in: [ProjectUserRole.ADMIN, ProjectUserRole.MEMBER] },
+            userId: ctx.session?.user.id,
+          },
+        },
+      },
+    },
+  });
+
+  if (!dataset) {
+    throw new TRPCError({ code: "UNAUTHORIZED" });
+  }
+};
+
+export const requireCanModifyDataset = async (datasetId: string, ctx: TRPCContext) => {
+  // Right now all users who can view a dataset can also modify it
+  await requireCanViewDataset(datasetId, ctx);
+};
+
 export const requireCanViewExperiment = (experimentId: string, ctx: TRPCContext): Promise<void> => {
   // Right now all experiments are publicly viewable, so this is a no-op.
   ctx.markAccessControlRun();
@@ -1,12 +1,11 @@
 "use client";
 import { useSession } from "next-auth/react";
 import React, { type ReactNode, useEffect } from "react";
-import { PostHogProvider, useActiveFeatureFlags } from "posthog-js/react";
+import { PostHogProvider } from "posthog-js/react";

 import posthog from "posthog-js";
 import { env } from "~/env.mjs";
 import { useRouter } from "next/router";
-import { useAppStore } from "~/state/store";

 // Make sure we're in the browser
 const inBrowser = typeof window !== "undefined";

@@ -25,14 +24,6 @@ export const PosthogAppProvider = ({ children }: { children: ReactNode }) => {
     };
   }, [router.events]);

-  const setFeatureFlags = useAppStore((s) => s.featureFlags.setFeatureFlags);
-  const activeFlags = useActiveFeatureFlags();
-  useEffect(() => {
-    if (activeFlags) {
-      setFeatureFlags(activeFlags);
-    }
-  }, [activeFlags, setFeatureFlags]);
-
   useEffect(() => {
     if (env.NEXT_PUBLIC_POSTHOG_KEY && inBrowser && session && session.user) {
       posthog.init(env.NEXT_PUBLIC_POSTHOG_KEY, {
@@ -26,6 +26,34 @@ export const useExperimentAccess = () => {
   return useExperiment().data?.access ?? { canView: false, canModify: false };
 };

+export const useDatasets = () => {
+  const selectedProjectId = useAppStore((state) => state.selectedProjectId);
+  return api.datasets.list.useQuery(
+    { projectId: selectedProjectId ?? "" },
+    { enabled: !!selectedProjectId },
+  );
+};
+
+export const useDataset = () => {
+  const router = useRouter();
+  const dataset = api.datasets.get.useQuery(
+    { id: router.query.id as string },
+    { enabled: !!router.query.id },
+  );
+
+  return dataset;
+};
+
+export const useDatasetEntries = () => {
+  const dataset = useDataset();
+  const { page, pageSize } = usePageParams();
+
+  return api.datasetEntries.list.useQuery(
+    { datasetId: dataset.data?.id ?? "", page, pageSize },
+    { enabled: dataset.data?.id != null },
+  );
+};
+
 type AsyncFunction<T extends unknown[], U> = (...args: T) => Promise<U>;

 export function useHandledAsyncCallback<T extends unknown[], U>(

@@ -177,22 +205,3 @@ export const useTagNames = () => {
     { enabled: !!selectedProjectId },
   );
 };
-
-export const useFineTunes = () => {
-  const selectedProjectId = useAppStore((state) => state.selectedProjectId);
-  const { page, pageSize } = usePageParams();
-
-  return api.fineTunes.list.useQuery(
-    { projectId: selectedProjectId ?? "", page, pageSize },
-    { enabled: !!selectedProjectId },
-  );
-};
-
-export const useIsClientRehydrated = () => {
-  const isRehydrated = useAppStore((state) => state.isRehydrated);
-  const [isMounted, setIsMounted] = useState(false);
-  useEffect(() => {
-    setIsMounted(true);
-  }, []);
-  return isRehydrated && isMounted;
-};
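As a short aside (not part of the diff): the three new hooks compose, since `useDatasetEntries` only fires once `useDataset` has resolved an id from the route. A hedged page-component sketch; the component itself and the `~/utils/hooks` import path are assumptions:

```tsx
// Illustrative sketch only — not code from this PR.
import { useDataset, useDatasetEntries } from "~/utils/hooks";

export default function DatasetPage() {
  const dataset = useDataset();
  const entries = useDatasetEntries();

  if (!dataset.data) return <div>Loading…</div>;

  return (
    <div>
      <h1>{dataset.data.name}</h1>
      {/* the entries payload has whatever shape datasetEntries.list returns */}
      <pre>{JSON.stringify(entries.data, null, 2)}</pre>
    </div>
  );
}
```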
9  app/test-docker.sh  Executable file

@@ -0,0 +1,9 @@
#! /bin/bash

set -e

cd "$(dirname "$0")/.."

source app/.env

docker build . --file app/Dockerfile
@@ -141,19 +141,9 @@
       "type": "object",
       "properties": {
         "status": {
-          "anyOf": [
-            {
-              "type": "string",
-              "enum": [
-                "ok"
-              ]
-            },
-            {
-              "type": "string",
-              "enum": [
-                "error"
-              ]
-            }
-          ]
+          "type": "string",
+          "enum": [
+            "ok"
+          ]
         }
       },
@@ -1,40 +0,0 @@ (file removed)
# OpenPipe Python Client

This client allows you automatically report your OpenAI calls to [OpenPipe](https://openpipe.ai/). OpenPipe

## Installation
`pip install openpipe`

## Usage

1. Create a project at https://app.openpipe.ai
2. Find your project's API key at https://app.openpipe.ai/project/settings
3. Configure the OpenPipe client as shown below.

```python
from openpipe import openai, configure_openpipe
import os

# Set the OpenPipe API key you got in step (3) above.
# If you have the `OPENPIPE_API_KEY` environment variable set we'll read from it by default.
configure_openpipe(api_key=os.getenv("OPENPIPE_API_KEY"))

# Configure OpenAI the same way you would normally
openai.api_key = os.getenv("OPENAI_API_KEY")
```

You can use the OpenPipe client for normal

## Special Features

### Tagging

OpenPipe has a concept of "tagging." This is very useful for grouping a certain set of completions together. When you're using a dataset for fine-tuning, you can select all the prompts that match a certain set of tags. Here's how you can use the tagging feature:

```python
completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "system", "content": "count to 10"}],
    openpipe={"tags": {"prompt_id": "counting"}},
)
```
@@ -1,202 +0,0 @@ (file removed)
[Standard Apache License, Version 2.0 text (January 2004, http://www.apache.org/licenses/) — full boilerplate license body omitted here.]
@@ -13,8 +13,7 @@ from .local_testing_only_get_latest_logged_call_response_200_tags import (
 from .report_json_body import ReportJsonBody
 from .report_json_body_tags import ReportJsonBodyTags
 from .report_response_200 import ReportResponse200
-from .report_response_200_status_type_0 import ReportResponse200StatusType0
-from .report_response_200_status_type_1 import ReportResponse200StatusType1
+from .report_response_200_status import ReportResponse200Status

 __all__ = (
     "CheckCacheJsonBody",

@@ -26,6 +25,5 @@ __all__ = (
     "ReportJsonBody",
     "ReportJsonBodyTags",
     "ReportResponse200",
-    "ReportResponse200StatusType0",
-    "ReportResponse200StatusType1",
+    "ReportResponse200Status",
 )
@@ -1,9 +1,8 @@
-from typing import Any, Dict, Type, TypeVar, Union
+from typing import Any, Dict, Type, TypeVar

 from attrs import define

-from ..models.report_response_200_status_type_0 import ReportResponse200StatusType0
-from ..models.report_response_200_status_type_1 import ReportResponse200StatusType1
+from ..models.report_response_200_status import ReportResponse200Status

 T = TypeVar("T", bound="ReportResponse200")

@@ -12,19 +11,13 @@ T = TypeVar("T", bound="ReportResponse200")
 class ReportResponse200:
     """
     Attributes:
-        status (Union[ReportResponse200StatusType0, ReportResponse200StatusType1]):
+        status (ReportResponse200Status):
     """

-    status: Union[ReportResponse200StatusType0, ReportResponse200StatusType1]
+    status: ReportResponse200Status

     def to_dict(self) -> Dict[str, Any]:
-        status: str
-
-        if isinstance(self.status, ReportResponse200StatusType0):
-            status = self.status.value
-
-        else:
-            status = self.status.value
+        status = self.status.value

         field_dict: Dict[str, Any] = {}
         field_dict.update(

@@ -38,23 +31,7 @@ class ReportResponse200:
     @classmethod
     def from_dict(cls: Type[T], src_dict: Dict[str, Any]) -> T:
         d = src_dict.copy()
-
-        def _parse_status(data: object) -> Union[ReportResponse200StatusType0, ReportResponse200StatusType1]:
-            try:
-                if not isinstance(data, str):
-                    raise TypeError()
-                status_type_0 = ReportResponse200StatusType0(data)
-
-                return status_type_0
-            except:  # noqa: E722
-                pass
-            if not isinstance(data, str):
-                raise TypeError()
-            status_type_1 = ReportResponse200StatusType1(data)
-
-            return status_type_1
-
-        status = _parse_status(d.pop("status"))
-
+        status = ReportResponse200Status(d.pop("status"))
         report_response_200 = cls(
             status=status,
@@ -1,7 +1,7 @@
 from enum import Enum
 
 
-class ReportResponse200StatusType0(str, Enum):
+class ReportResponse200Status(str, Enum):
     OK = "ok"
 
     def __str__(self) -> str:
@@ -1,8 +0,0 @@
-from enum import Enum
-
-
-class ReportResponse200StatusType1(str, Enum):
-    ERROR = "error"
-
-    def __str__(self) -> str:
-        return str(self.value)
@@ -4,7 +4,7 @@ import time
 import inspect
 
 from openpipe.merge_openai_chunks import merge_openai_chunks
-from openpipe.openpipe_meta import openpipe_meta
+from openpipe.openpipe_meta import OpenPipeMeta
 
 from .shared import (
     _should_check_cache,
@@ -41,11 +41,9 @@ class WrappedChatCompletion(original_openai.ChatCompletion):
             )
 
             cache_status = (
-                "MISS"
-                if _should_check_cache(openpipe_options, kwargs)
-                else "SKIP"
+                "MISS" if _should_check_cache(openpipe_options) else "SKIP"
             )
-            chunk.openpipe = openpipe_meta(cache_status=cache_status)
+            chunk.openpipe = OpenPipeMeta(cache_status=cache_status)
 
             yield chunk
 
@@ -74,9 +72,9 @@ class WrappedChatCompletion(original_openai.ChatCompletion):
             )
 
             cache_status = (
-                "MISS" if _should_check_cache(openpipe_options, kwargs) else "SKIP"
+                "MISS" if _should_check_cache(openpipe_options) else "SKIP"
             )
-            chat_completion["openpipe"] = openpipe_meta(cache_status=cache_status)
+            chat_completion["openpipe"] = OpenPipeMeta(cache_status=cache_status)
             return chat_completion
         except Exception as e:
             received_at = int(time.time() * 1000)
@@ -128,11 +126,9 @@ class WrappedChatCompletion(original_openai.ChatCompletion):
                 assembled_completion, chunk
             )
             cache_status = (
-                "MISS"
-                if _should_check_cache(openpipe_options, kwargs)
-                else "SKIP"
+                "MISS" if _should_check_cache(openpipe_options) else "SKIP"
             )
-            chunk.openpipe = openpipe_meta(cache_status=cache_status)
+            chunk.openpipe = OpenPipeMeta(cache_status=cache_status)
 
             yield chunk
 
@@ -161,9 +157,9 @@ class WrappedChatCompletion(original_openai.ChatCompletion):
             )
 
             cache_status = (
-                "MISS" if _should_check_cache(openpipe_options, kwargs) else "SKIP"
+                "MISS" if _should_check_cache(openpipe_options) else "SKIP"
             )
-            chat_completion["openpipe"] = openpipe_meta(cache_status=cache_status)
+            chat_completion["openpipe"] = OpenPipeMeta(cache_status=cache_status)
 
             return chat_completion
         except Exception as e:
@@ -1,2 +1,7 @@
-def openpipe_meta(cache_status: str):
-    return {"cache_status": cache_status}
+from attr import dataclass
+
+
+@dataclass
+class OpenPipeMeta:
+    # Cache status. One of 'HIT', 'MISS', 'SKIP'
+    cache_status: str
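In the right-hand version, the `openpipe_meta` helper that returned a plain dict is replaced by an `OpenPipeMeta` dataclass, so completion metadata is read as an attribute instead of a dictionary key. A minimal sketch of what calling code looks like against the dataclass version (the `from openpipe import openai` import path is an assumption for illustration):

```python
# Sketch only: assumes the patched client exported by the openpipe package
# and an OPENPIPE_API_KEY configured in the environment.
from openpipe import openai

completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "system", "content": "count to 3"}],
)

# Dict version:      completion.openpipe["cache_status"]
# Dataclass version: completion.openpipe.cache_status  ->  "HIT", "MISS", or "SKIP"
print(completion.openpipe.cache_status)
```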
@@ -8,7 +8,6 @@ from openpipe.api_client.models.report_json_body_tags import (
 )
 import toml
 import time
-import os
 
 version = toml.load("pyproject.toml")["tool"]["poetry"]["version"]
 
@@ -16,9 +15,6 @@ configured_client = AuthenticatedClient(
     base_url="https://app.openpipe.ai/api/v1", token=""
 )
 
-if os.environ.get("OPENPIPE_API_KEY"):
-    configured_client.token = os.environ["OPENPIPE_API_KEY"]
-
 
 def _get_tags(openpipe_options):
     tags = openpipe_options.get("tags") or {}
@@ -28,18 +24,10 @@ def _get_tags(openpipe_options):
     return ReportJsonBodyTags.from_dict(tags)
 
 
-def _should_check_cache(openpipe_options, req_payload):
+def _should_check_cache(openpipe_options):
     if configured_client.token == "":
         return False
-
-    cache_requested = openpipe_options.get("cache", False)
-    streaming = req_payload.get("stream", False)
-    if cache_requested and streaming:
-        print(
-            "Caching is not yet supported for streaming requests. Ignoring cache flag. Vote for this feature at https://github.com/OpenPipe/OpenPipe/issues/159"
-        )
-        return False
-    return cache_requested
+    return openpipe_options.get("cache", False)
 
 
 def _process_cache_payload(
@@ -56,7 +44,7 @@ def maybe_check_cache(
     openpipe_options={},
     req_payload={},
 ):
-    if not _should_check_cache(openpipe_options, req_payload):
+    if not _should_check_cache(openpipe_options):
         return None
     try:
         payload = check_cache.sync(
@@ -80,7 +68,7 @@ async def maybe_check_cache_async(
     openpipe_options={},
     req_payload={},
 ):
-    if not _should_check_cache(openpipe_options, req_payload):
+    if not _should_check_cache(openpipe_options):
         return None
 
     try:
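The simplified `_should_check_cache` drops the streaming guard and only looks at the configured API token and the `cache` flag in `openpipe_options`. A standalone sketch of the behaviour it implements (the `client_token` parameter stands in for `configured_client.token` and is an assumption for illustration):

```python
# Standalone sketch of the simplified cache check; client_token stands in
# for configured_client.token.
def should_check_cache(openpipe_options: dict, client_token: str) -> bool:
    if client_token == "":
        # Without an API key there is nothing to look up in the cache.
        return False
    return openpipe_options.get("cache", False)


assert should_check_cache({"cache": True}, "sk-example") is True
assert should_check_cache({"cache": True}, "") is False
assert should_check_cache({}, "sk-example") is False
```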
@@ -27,14 +27,12 @@ def last_logged_call():
     return local_testing_only_get_latest_logged_call.sync(client=configured_client)
 
 
-@pytest.mark.focus
 def test_sync():
     completion = openai.ChatCompletion.create(
         model="gpt-3.5-turbo",
         messages=[{"role": "system", "content": "count to 3"}],
     )
 
-    print("completion is", completion)
     last_logged = last_logged_call()
     assert (
         last_logged.model_response.resp_payload["choices"][0]["message"]["content"]
@@ -44,7 +42,7 @@ def test_sync():
         last_logged.model_response.req_payload["messages"][0]["content"] == "count to 3"
     )
 
-    assert completion.openpipe["cache_status"] == "SKIP"
+    assert completion.openpipe.cache_status == "SKIP"
 
 
 def test_streaming():
@@ -77,7 +75,7 @@ async def test_async():
         == "count down from 5"
     )
 
-    assert completion.openpipe["cache_status"] == "SKIP"
+    assert completion.openpipe.cache_status == "SKIP"
 
 
 async def test_async_streaming():
@@ -89,7 +87,7 @@ async def test_async_streaming():
 
     merged = None
     async for chunk in completion:
-        assert chunk.openpipe["cache_status"] == "SKIP"
+        assert chunk.openpipe.cache_status == "SKIP"
         merged = merge_openai_chunks(merged, chunk)
 
     last_logged = last_logged_call()
@@ -102,7 +100,7 @@ async def test_async_streaming():
         last_logged.model_response.req_payload["messages"][0]["content"]
         == "count down from 5"
    )
-    assert merged["openpipe"]["cache_status"] == "SKIP"
+    assert merged["openpipe"].cache_status == "SKIP"
 
 
 def test_sync_with_tags():
@@ -148,7 +146,7 @@ async def test_caching():
         messages=messages,
         openpipe={"cache": True},
     )
-    assert completion.openpipe["cache_status"] == "MISS"
+    assert completion.openpipe.cache_status == "MISS"
 
     first_logged = last_logged_call()
     assert (
@@ -161,4 +159,4 @@ async def test_caching():
         messages=messages,
         openpipe={"cache": True},
     )
-    assert completion2.openpipe["cache_status"] == "HIT"
+    assert completion2.openpipe.cache_status == "HIT"
@@ -1,12 +1,9 @@
 [tool.poetry]
 name = "openpipe"
-version = "3.0.1"
-description = "Python client library for the OpenPipe service"
-authors = ["Kyle Corbitt <kyle@openpipe.ai>"]
+version = "0.1.0"
+description = ""
+authors = ["Kyle Corbitt <kyle@corbt.com>"]
 license = "Apache-2.0"
-readme = "README.md"
-homepage = "https://github.com/OpenPipe/OpenPipe"
-repository = "https://github.com/OpenPipe/OpenPipe"
 
 [tool.poetry.dependencies]
 python = "^3.9"
@@ -13,17 +13,15 @@
   "author": "",
   "license": "Apache-2.0",
   "dependencies": {
-    "encoding": "^0.1.13",
     "form-data": "^4.0.0",
     "lodash-es": "^4.17.21",
-    "node-fetch": "^2.6.12",
+    "node-fetch": "^3.3.2",
     "openai-beta": "npm:openai@4.0.0-beta.7",
     "openai-legacy": "npm:openai@3.3.0"
   },
   "devDependencies": {
     "@types/lodash-es": "^4.17.8",
     "@types/node": "^20.4.8",
-    "@types/node-fetch": "^2.6.4",
     "dotenv": "^16.3.1",
     "tsx": "^3.12.7",
     "typescript": "^5.0.4",
@@ -2,283 +2,301 @@
 /* istanbul ignore file */
 /* tslint:disable */
 /* eslint-disable */
-import FormData from 'form-data';
-import fetch, { Headers } from 'node-fetch';
-import type { RequestInit, Response } from 'node-fetch';
-import type { AbortSignal } from 'node-fetch/externals';
-
-import { ApiError } from './ApiError';
-import type { ApiRequestOptions } from './ApiRequestOptions';
-import type { ApiResult } from './ApiResult';
-import { CancelablePromise } from './CancelablePromise';
-import type { OnCancel } from './CancelablePromise';
-import type { OpenAPIConfig } from './OpenAPI';
-
-export const isDefined = <T>(value: T | null | undefined): value is Exclude<T, null | undefined> => {
-  return value !== undefined && value !== null;
+import FormData from "form-data";
+import fetch, { Headers } from "node-fetch";
+import type { RequestInit, Response } from "node-fetch";
+
+// @ts-expect-error TODO maybe I need an older node-fetch or something?
+import type { AbortSignal } from "node-fetch/externals";
+
+import { ApiError } from "./ApiError";
+import type { ApiRequestOptions } from "./ApiRequestOptions";
+import type { ApiResult } from "./ApiResult";
+import { CancelablePromise } from "./CancelablePromise";
+import type { OnCancel } from "./CancelablePromise";
+import type { OpenAPIConfig } from "./OpenAPI";
+
+export const isDefined = <T>(
+  value: T | null | undefined
+): value is Exclude<T, null | undefined> => {
+  return value !== undefined && value !== null;
 };
 
 export const isString = (value: any): value is string => {
-  return typeof value === 'string';
+  return typeof value === "string";
 };
 
 export const isStringWithValue = (value: any): value is string => {
-  return isString(value) && value !== '';
+  return isString(value) && value !== "";
 };
 
 export const isBlob = (value: any): value is Blob => {
   return (
-    typeof value === 'object' &&
-    typeof value.type === 'string' &&
-    typeof value.stream === 'function' &&
-    typeof value.arrayBuffer === 'function' &&
-    typeof value.constructor === 'function' &&
-    typeof value.constructor.name === 'string' &&
+    typeof value === "object" &&
+    typeof value.type === "string" &&
+    typeof value.stream === "function" &&
+    typeof value.arrayBuffer === "function" &&
+    typeof value.constructor === "function" &&
+    typeof value.constructor.name === "string" &&
     /^(Blob|File)$/.test(value.constructor.name) &&
     /^(Blob|File)$/.test(value[Symbol.toStringTag])
   );
 };
 
 export const isFormData = (value: any): value is FormData => {
   return value instanceof FormData;
 };
 
 export const base64 = (str: string): string => {
   try {
     return btoa(str);
   } catch (err) {
     // @ts-ignore
-    return Buffer.from(str).toString('base64');
+    return Buffer.from(str).toString("base64");
   }
 };
 
 export const getQueryString = (params: Record<string, any>): string => {
   const qs: string[] = [];
 
   const append = (key: string, value: any) => {
     qs.push(`${encodeURIComponent(key)}=${encodeURIComponent(String(value))}`);
   };
 
   const process = (key: string, value: any) => {
     if (isDefined(value)) {
       if (Array.isArray(value)) {
-        value.forEach(v => {
+        value.forEach((v) => {
           process(key, v);
         });
-      } else if (typeof value === 'object') {
+      } else if (typeof value === "object") {
         Object.entries(value).forEach(([k, v]) => {
           process(`${key}[${k}]`, v);
         });
       } else {
         append(key, value);
       }
     }
   };
 
   Object.entries(params).forEach(([key, value]) => {
     process(key, value);
   });
 
   if (qs.length > 0) {
-    return `?${qs.join('&')}`;
+    return `?${qs.join("&")}`;
   }
 
-  return '';
+  return "";
 };
 
 const getUrl = (config: OpenAPIConfig, options: ApiRequestOptions): string => {
   const encoder = config.ENCODE_PATH || encodeURI;
 
   const path = options.url
-    .replace('{api-version}', config.VERSION)
+    .replace("{api-version}", config.VERSION)
     .replace(/{(.*?)}/g, (substring: string, group: string) => {
       if (options.path?.hasOwnProperty(group)) {
         return encoder(String(options.path[group]));
       }
       return substring;
     });
 
   const url = `${config.BASE}${path}`;
   if (options.query) {
     return `${url}${getQueryString(options.query)}`;
   }
   return url;
 };
 
 export const getFormData = (options: ApiRequestOptions): FormData | undefined => {
   if (options.formData) {
     const formData = new FormData();
 
     const process = (key: string, value: any) => {
       if (isString(value) || isBlob(value)) {
         formData.append(key, value);
       } else {
         formData.append(key, JSON.stringify(value));
       }
     };
 
     Object.entries(options.formData)
       .filter(([_, value]) => isDefined(value))
       .forEach(([key, value]) => {
         if (Array.isArray(value)) {
-          value.forEach(v => process(key, v));
+          value.forEach((v) => process(key, v));
         } else {
           process(key, value);
         }
       });
 
     return formData;
   }
   return undefined;
 };
 
 type Resolver<T> = (options: ApiRequestOptions) => Promise<T>;
 
-export const resolve = async <T>(options: ApiRequestOptions, resolver?: T | Resolver<T>): Promise<T | undefined> => {
-  if (typeof resolver === 'function') {
-    return (resolver as Resolver<T>)(options);
-  }
-  return resolver;
+export const resolve = async <T>(
+  options: ApiRequestOptions,
+  resolver?: T | Resolver<T>
+): Promise<T | undefined> => {
+  if (typeof resolver === "function") {
+    return (resolver as Resolver<T>)(options);
+  }
+  return resolver;
 };
 
-export const getHeaders = async (config: OpenAPIConfig, options: ApiRequestOptions): Promise<Headers> => {
+export const getHeaders = async (
+  config: OpenAPIConfig,
+  options: ApiRequestOptions
+): Promise<Headers> => {
   const token = await resolve(options, config.TOKEN);
   const username = await resolve(options, config.USERNAME);
   const password = await resolve(options, config.PASSWORD);
   const additionalHeaders = await resolve(options, config.HEADERS);
 
   const headers = Object.entries({
-    Accept: 'application/json',
+    Accept: "application/json",
     ...additionalHeaders,
     ...options.headers,
   })
     .filter(([_, value]) => isDefined(value))
-    .reduce((headers, [key, value]) => ({
-      ...headers,
-      [key]: String(value),
-    }), {} as Record<string, string>);
+    .reduce(
+      (headers, [key, value]) => ({
+        ...headers,
+        [key]: String(value),
+      }),
+      {} as Record<string, string>
+    );
 
   if (isStringWithValue(token)) {
-    headers['Authorization'] = `Bearer ${token}`;
+    headers["Authorization"] = `Bearer ${token}`;
   }
 
   if (isStringWithValue(username) && isStringWithValue(password)) {
     const credentials = base64(`${username}:${password}`);
-    headers['Authorization'] = `Basic ${credentials}`;
+    headers["Authorization"] = `Basic ${credentials}`;
   }
 
   if (options.body) {
     if (options.mediaType) {
-      headers['Content-Type'] = options.mediaType;
+      headers["Content-Type"] = options.mediaType;
     } else if (isBlob(options.body)) {
-      headers['Content-Type'] = 'application/octet-stream';
+      headers["Content-Type"] = "application/octet-stream";
     } else if (isString(options.body)) {
-      headers['Content-Type'] = 'text/plain';
+      headers["Content-Type"] = "text/plain";
    } else if (!isFormData(options.body)) {
-      headers['Content-Type'] = 'application/json';
+      headers["Content-Type"] = "application/json";
     }
   }
 
   return new Headers(headers);
 };
 
 export const getRequestBody = (options: ApiRequestOptions): any => {
   if (options.body !== undefined) {
-    if (options.mediaType?.includes('/json')) {
-      return JSON.stringify(options.body)
+    if (options.mediaType?.includes("/json")) {
+      return JSON.stringify(options.body);
     } else if (isString(options.body) || isBlob(options.body) || isFormData(options.body)) {
       return options.body as any;
     } else {
       return JSON.stringify(options.body);
     }
   }
   return undefined;
 };
 
 export const sendRequest = async (
   options: ApiRequestOptions,
   url: string,
   body: any,
   formData: FormData | undefined,
   headers: Headers,
   onCancel: OnCancel
 ): Promise<Response> => {
   const controller = new AbortController();
 
   const request: RequestInit = {
     headers,
     method: options.method,
     body: body ?? formData,
     signal: controller.signal as AbortSignal,
   };
 
   onCancel(() => controller.abort());
 
   return await fetch(url, request);
 };
 
-export const getResponseHeader = (response: Response, responseHeader?: string): string | undefined => {
+export const getResponseHeader = (
+  response: Response,
+  responseHeader?: string
+): string | undefined => {
   if (responseHeader) {
     const content = response.headers.get(responseHeader);
     if (isString(content)) {
       return content;
     }
   }
   return undefined;
 };
 
 export const getResponseBody = async (response: Response): Promise<any> => {
   if (response.status !== 204) {
     try {
-      const contentType = response.headers.get('Content-Type');
+      const contentType = response.headers.get("Content-Type");
       if (contentType) {
-        const jsonTypes = ['application/json', 'application/problem+json']
-        const isJSON = jsonTypes.some(type => contentType.toLowerCase().startsWith(type));
+        const jsonTypes = ["application/json", "application/problem+json"];
+        const isJSON = jsonTypes.some((type) => contentType.toLowerCase().startsWith(type));
         if (isJSON) {
           return await response.json();
         } else {
           return await response.text();
         }
       }
     } catch (error) {
       console.error(error);
     }
   }
   return undefined;
 };
 
 export const catchErrorCodes = (options: ApiRequestOptions, result: ApiResult): void => {
   const errors: Record<number, string> = {
-    400: 'Bad Request',
-    401: 'Unauthorized',
-    403: 'Forbidden',
-    404: 'Not Found',
-    500: 'Internal Server Error',
-    502: 'Bad Gateway',
-    503: 'Service Unavailable',
+    400: "Bad Request",
+    401: "Unauthorized",
+    403: "Forbidden",
+    404: "Not Found",
+    500: "Internal Server Error",
+    502: "Bad Gateway",
+    503: "Service Unavailable",
     ...options.errors,
-  }
+  };
 
   const error = errors[result.status];
   if (error) {
     throw new ApiError(options, result, error);
   }
 
   if (!result.ok) {
-    const errorStatus = result.status ?? 'unknown';
-    const errorStatusText = result.statusText ?? 'unknown';
+    const errorStatus = result.status ?? "unknown";
+    const errorStatusText = result.statusText ?? "unknown";
     const errorBody = (() => {
       try {
         return JSON.stringify(result.body, null, 2);
       } catch (e) {
         return undefined;
       }
     })();
 
-    throw new ApiError(options, result,
-      `Generic Error: status: ${errorStatus}; status text: ${errorStatusText}; body: ${errorBody}`
-    );
+    throw new ApiError(
+      options,
+      result,
+      `Generic Error: status: ${errorStatus}; status text: ${errorStatusText}; body: ${errorBody}`
+    );
   }
 };
 
 /**
@@ -288,33 +306,36 @@ export const catchErrorCodes = (options: ApiRequestOptions, result: ApiResult):
  * @returns CancelablePromise<T>
 * @throws ApiError
 */
-export const request = <T>(config: OpenAPIConfig, options: ApiRequestOptions): CancelablePromise<T> => {
+export const request = <T>(
+  config: OpenAPIConfig,
+  options: ApiRequestOptions
+): CancelablePromise<T> => {
   return new CancelablePromise(async (resolve, reject, onCancel) => {
     try {
       const url = getUrl(config, options);
       const formData = getFormData(options);
       const body = getRequestBody(options);
       const headers = await getHeaders(config, options);
 
       if (!onCancel.isCancelled) {
         const response = await sendRequest(options, url, body, formData, headers, onCancel);
         const responseBody = await getResponseBody(response);
         const responseHeader = getResponseHeader(response, options.responseHeader);
 
         const result: ApiResult = {
           url,
           ok: response.ok,
           status: response.status,
           statusText: response.statusText,
           body: responseHeader ?? responseBody,
         };
 
         catchErrorCodes(options, result);
 
         resolve(result.body);
       }
     } catch (error) {
       reject(error);
     }
   });
 };
@@ -82,7 +82,7 @@ export class DefaultService {
       tags?: Record<string, string>;
     },
   ): CancelablePromise<{
-    status: ('ok' | 'error');
+    status: 'ok';
   }> {
     return this.httpRequest.request({
       method: 'POST',
@@ -2,13 +2,10 @@ import dotenv from "dotenv";
 import { expect, test } from "vitest";
 import OpenAI from ".";
 import {
-  ChatCompletion,
   CompletionCreateParams,
   CreateChatCompletionRequestMessage,
 } from "openai-beta/resources/chat/completions";
 import { OPClient } from "../codegen";
-import mergeChunks from "./mergeChunks";
-import assert from "assert";
 
 dotenv.config({ path: "../.env" });
 
@@ -34,7 +31,9 @@ test("basic call", async () => {
   };
   const completion = await oaiClient.chat.completions.create({
     ...payload,
-    openpipe: { tags: { promptId: "test" } },
+    openpipe: {
+      tags: { promptId: "test" },
+    },
   });
   await completion.openpipe.reportingFinished;
   const lastLogged = await lastLoggedCall();
@@ -47,32 +46,29 @@ const randomString = (length: number) => {
   const characters = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
   return Array.from(
     { length },
-    () => characters[Math.floor(Math.random() * characters.length)],
+    () => characters[Math.floor(Math.random() * characters.length)]
   ).join("");
 };
 
-test("streaming", async () => {
+test.skip("streaming", async () => {
   const completion = await oaiClient.chat.completions.create({
     model: "gpt-3.5-turbo",
-    messages: [{ role: "system", content: "count to 3" }],
+    messages: [{ role: "system", content: "count to 4" }],
     stream: true,
   });
 
-  let merged: ChatCompletion | null = null;
+  let merged = null;
   for await (const chunk of completion) {
-    merged = mergeChunks(merged, chunk);
+    merged = merge_openai_chunks(merged, chunk);
   }
 
   const lastLogged = await lastLoggedCall();
-  await completion.openpipe.reportingFinished;
-
-  expect(merged).toMatchObject(lastLogged?.modelResponse?.respPayload);
-  expect(lastLogged?.modelResponse?.reqPayload.messages).toMatchObject([
-    { role: "system", content: "count to 3" },
-  ]);
+  expect(lastLogged?.modelResponse?.respPayload.choices[0].message.content).toBe(
+    merged.choices[0].message.content
+  );
 });
 
-test("bad call streaming", async () => {
+test.skip("bad call streaming", async () => {
   try {
     await oaiClient.chat.completions.create({
       model: "gpt-3.5-turbo-blaster",
@@ -80,29 +76,26 @@ test("bad call streaming", async () => {
       stream: true,
     });
   } catch (e) {
-    await e.openpipe.reportingFinished;
     const lastLogged = await lastLoggedCall();
-    expect(lastLogged?.modelResponse?.errorMessage).toEqual(
-      "The model `gpt-3.5-turbo-blaster` does not exist",
+    expect(lastLogged?.modelResponse?.errorMessage).toBe(
+      "The model `gpt-3.5-turbo-blaster` does not exist"
     );
-    expect(lastLogged?.modelResponse?.statusCode).toEqual(404);
+    expect(lastLogged?.modelResponse?.statusCode).toBe(404);
   }
 });
 
 test("bad call", async () => {
   try {
     await oaiClient.chat.completions.create({
-      model: "gpt-3.5-turbo-buster",
+      model: "gpt-3.5-turbo-booster",
       messages: [{ role: "system", content: "count to 10" }],
     });
   } catch (e) {
-    assert("openpipe" in e);
-    await e.openpipe.reportingFinished;
     const lastLogged = await lastLoggedCall();
-    expect(lastLogged?.modelResponse?.errorMessage).toEqual(
-      "The model `gpt-3.5-turbo-buster` does not exist",
+    expect(lastLogged?.modelResponse?.errorMessage).toBe(
+      "The model `gpt-3.5-turbo-booster` does not exist"
    );
-    expect(lastLogged?.modelResponse?.statusCode).toEqual(404);
+    expect(lastLogged?.modelResponse?.statusCode).toBe(404);
   }
 });
 
@@ -116,12 +109,12 @@ test("caching", async () => {
     messages: [message],
     openpipe: { cache: true },
   });
-  expect(completion.openpipe.cacheStatus).toEqual("MISS");
+  expect(completion.openpipe.cacheStatus).toBe("MISS");
 
   await completion.openpipe.reportingFinished;
   const firstLogged = await lastLoggedCall();
-  expect(completion.choices[0].message.content).toEqual(
-    firstLogged?.modelResponse?.respPayload.choices[0].message.content,
+  expect(completion.choices[0].message.content).toBe(
+    firstLogged?.modelResponse?.respPayload.choices[0].message.content
   );
 
   const completion2 = await oaiClient.chat.completions.create({
@@ -129,5 +122,5 @@ test("caching", async () => {
     messages: [message],
     openpipe: { cache: true },
   });
-  expect(completion2.openpipe.cacheStatus).toEqual("HIT");
+  expect(completion2.openpipe.cacheStatus).toBe("HIT");
 });
@@ -5,9 +5,9 @@ import {
   ChatCompletion,
   ChatCompletionChunk,
   CompletionCreateParams,
+  Completions,
 } from "openai-beta/resources/chat/completions";
 
-import { WrappedStream } from "./streaming";
 import { DefaultService, OPClient } from "../codegen";
 import { Stream } from "openai-beta/streaming";
 import { OpenPipeArgs, OpenPipeMeta, type OpenPipeConfig, getTags } from "../shared";
@@ -27,11 +27,11 @@ export default class OpenAI extends openai.OpenAI {
           BASE:
             openpipe?.baseUrl ?? readEnv("OPENPIPE_BASE_URL") ?? "https://app.openpipe.ai/api/v1",
           TOKEN: openPipeApiKey,
-        }),
+        })
       );
     } else {
       console.warn(
-        "You're using the OpenPipe client without an API key. No completion requests will be logged.",
+        "You're using the OpenPipe client without an API key. No completion requests will be logged."
       );
     }
   }
@@ -43,10 +43,10 @@ class WrappedChat extends openai.OpenAI.Chat {
     this.completions.opClient = client;
   }
 
-  completions: WrappedCompletions = new WrappedCompletions(this.client);
+  completions: InstrumentedCompletions = new InstrumentedCompletions(this.client);
 }
 
-class WrappedCompletions extends openai.OpenAI.Chat.Completions {
+class InstrumentedCompletions extends openai.OpenAI.Chat.Completions {
   opClient?: OPClient;
 
   constructor(client: openai.OpenAI, opClient?: OPClient) {
@@ -54,35 +54,32 @@ class WrappedCompletions extends openai.OpenAI.Chat.Completions {
     this.opClient = opClient;
   }
 
-  async _report(args: Parameters<DefaultService["report"]>[0]) {
+  _report(args: Parameters<DefaultService["report"]>[0]) {
     try {
-      this.opClient ? await this.opClient.default.report(args) : Promise.resolve();
+      return this.opClient ? this.opClient.default.report(args) : Promise.resolve();
     } catch (e) {
       console.error(e);
+      return Promise.resolve();
     }
   }
 
   create(
     body: CompletionCreateParams.CreateChatCompletionRequestNonStreaming & OpenPipeArgs,
-    options?: Core.RequestOptions,
+    options?: Core.RequestOptions
   ): Promise<Core.APIResponse<ChatCompletion & { openpipe: OpenPipeMeta }>>;
   create(
     body: CompletionCreateParams.CreateChatCompletionRequestStreaming & OpenPipeArgs,
-    options?: Core.RequestOptions,
-  ): Promise<Core.APIResponse<WrappedStream>>;
+    options?: Core.RequestOptions
+  ): Promise<Core.APIResponse<Stream<ChatCompletionChunk>>>;
   async create(
     { openpipe, ...body }: CompletionCreateParams & OpenPipeArgs,
-    options?: Core.RequestOptions,
-  ): Promise<Core.APIResponse<(ChatCompletion & { openpipe: OpenPipeMeta }) | WrappedStream>> {
+    options?: Core.RequestOptions
+  ): Promise<
+    Core.APIResponse<(ChatCompletion & { openpipe: OpenPipeMeta }) | Stream<ChatCompletionChunk>>
+  > {
+    console.log("LALALA REPORT", this.opClient);
     const requestedAt = Date.now();
-    let reportingFinished: OpenPipeMeta["reportingFinished"] = Promise.resolve();
-    let cacheRequested = openpipe?.cache ?? false;
-    if (cacheRequested && body.stream) {
-      console.warn(
-        `Caching is not yet supported for streaming requests. Ignoring cache flag. Vote for this feature at https://github.com/OpenPipe/OpenPipe/issues/159`,
-      );
-      cacheRequested = false;
-    }
+    const cacheRequested = openpipe?.cache ?? false;
 
     if (cacheRequested) {
       try {
@@ -95,13 +92,12 @@ class WrappedCompletions extends openai.OpenAI.Chat.Completions {
           .then((res) => res.respPayload);
 
         if (cached) {
-          const meta = {
-            cacheStatus: "HIT",
-            reportingFinished,
-          };
           return {
             ...cached,
-            openpipe: meta,
+            openpipe: {
+              cacheStatus: "HIT",
+              reportingFinished: Promise.resolve(),
+            },
           };
         }
       } catch (e) {
@@ -109,23 +105,15 @@ class WrappedCompletions extends openai.OpenAI.Chat.Completions {
       }
     }
 
+    let reportingFinished: OpenPipeMeta["reportingFinished"] = Promise.resolve();
+
     try {
       if (body.stream) {
         const stream = await super.create(body, options);
-        const wrappedStream = new WrappedStream(stream, (response) =>
-          this._report({
-            requestedAt,
-            receivedAt: Date.now(),
-            reqPayload: body,
-            respPayload: response,
-            statusCode: 200,
-            tags: getTags(openpipe),
-          }),
-        );
-
         // Do some logging of each chunk here
-        return wrappedStream;
+        return stream;
       } else {
         const response = await super.create(body, options);
 
@@ -159,16 +147,6 @@ class WrappedCompletions extends openai.OpenAI.Chat.Completions {
           tags: getTags(openpipe),
         });
       }
-      // make sure error is an object we can add properties to
-      if (typeof error === "object" && error !== null) {
-        error = {
-          ...error,
-          openpipe: {
-            cacheStatus: cacheRequested ? "MISS" : "SKIP",
-            reportingFinished,
-          },
-        };
-      }
 
       throw error;
     }
@@ -1,43 +0,0 @@
-import { ChatCompletion, ChatCompletionChunk } from "openai-beta/resources/chat";
-import { Stream } from "openai-beta/streaming";
-import { OpenPipeMeta } from "../shared";
-import mergeChunks from "./mergeChunks";
-
-export class WrappedStream extends Stream<ChatCompletionChunk> {
-  openpipe: OpenPipeMeta;
-
-  private resolveReportingFinished: () => void = () => {};
-  private report: (response: unknown) => Promise<void>;
-
-  constructor(stream: Stream<ChatCompletionChunk>, report: (response: unknown) => Promise<void>) {
-    super(stream.response, stream.controller);
-    this.report = report;
-
-    const reportingFinished = new Promise<void>((resolve) => {
-      this.resolveReportingFinished = resolve;
-    });
-
-    this.openpipe = {
-      cacheStatus: "MISS",
-      reportingFinished,
-    };
-  }
-
-  async *[Symbol.asyncIterator](): AsyncIterator<ChatCompletionChunk, any, undefined> {
-    const iterator = super[Symbol.asyncIterator]();
-
-    let combinedResponse: ChatCompletion | null = null;
-    while (true) {
-      const result = await iterator.next();
-      if (result.done) break;
-      combinedResponse = mergeChunks(combinedResponse, result.value);
-
-      yield result.value;
-    }
-
-    await this.report(combinedResponse);
-
-    // Resolve the promise here
-    this.resolveReportingFinished();
-  }
-}
@@ -1,5 +1,4 @@
 import pkg from "../package.json";
-import { DefaultService } from "./codegen";
 
 export type OpenPipeConfig = {
   apiKey?: string;
@@ -16,11 +15,9 @@ export type OpenPipeMeta = {
   // We report your call to OpenPipe asynchronously in the background. If you
   // need to wait until the report is sent to take further action, you can await
   // this promise.
-  reportingFinished: Promise<void>;
+  reportingFinished: Promise<void | { status: "ok" }>;
 };
 
-export type ReportFn = (...args: Parameters<DefaultService["report"]>) => Promise<void>;
-
 export const getTags = (args: OpenPipeArgs["openpipe"]): Record<string, string> => ({
   ...args?.tags,
   ...(args?.cache ? { $cache: args.cache?.toString() } : {}),
289 pnpm-lock.yaml (generated)

@@ -80,9 +80,6 @@ importers:
       '@vercel/og':
         specifier: ^0.5.9
         version: 0.5.9
-      archiver:
-        specifier: ^6.0.0
-        version: 6.0.0
       ast-types:
         specifier: ^0.14.2
         version: 0.14.2
@@ -119,9 +116,6 @@ importers:
       graphile-worker:
         specifier: ^0.13.0
         version: 0.13.0
-      human-id:
-        specifier: ^4.0.0
-        version: 4.0.0
       immer:
         specifier: ^10.0.2
         version: 10.0.2
@@ -172,7 +166,7 @@ importers:
         version: 6.9.4
       openai:
         specifier: 4.0.0-beta.7
-        version: 4.0.0-beta.7(encoding@0.1.13)
+        version: 4.0.0-beta.7
       openpipe:
         specifier: workspace:*
         version: link:../client-libs/typescript
@@ -236,9 +230,6 @@ importers:
      socket.io-client:
         specifier: ^4.7.1
         version: 4.7.1
-      stream-buffers:
-        specifier: ^3.0.2
-        version: 3.0.2
       superjson:
         specifier: 1.12.2
         version: 1.12.2
@@ -270,9 +261,6 @@ importers:
       '@openapi-contrib/openapi-schema-to-json-schema':
         specifier: ^4.0.5
         version: 4.0.5
-      '@types/archiver':
-        specifier: ^5.3.2
-        version: 5.3.2
       '@types/babel__core':
         specifier: ^7.20.1
         version: 7.20.1
@@ -321,9 +309,6 @@ importers:
       '@types/react-syntax-highlighter':
         specifier: ^15.5.7
         version: 15.5.7
-      '@types/stream-buffers':
-        specifier: ^3.0.4
-        version: 3.0.4
       '@types/uuid':
         specifier: ^9.0.2
         version: 9.0.2
@@ -372,9 +357,6 @@ importers:
 
   client-libs/typescript:
     dependencies:
-      encoding:
-        specifier: ^0.1.13
-        version: 0.1.13
       form-data:
         specifier: ^4.0.0
         version: 4.0.0
@@ -382,11 +364,11 @@ importers:
         specifier: ^4.17.21
         version: 4.17.21
       node-fetch:
-        specifier: ^2.6.12
-        version: 2.6.12(encoding@0.1.13)
+        specifier: ^3.3.2
+        version: 3.3.2
       openai-beta:
         specifier: npm:openai@4.0.0-beta.7
-        version: /openai@4.0.0-beta.7(encoding@0.1.13)
+        version: /openai@4.0.0-beta.7
       openai-legacy:
         specifier: npm:openai@3.3.0
         version: /openai@3.3.0
@@ -397,9 +379,6 @@ importers:
       '@types/node':
         specifier: ^20.4.8
         version: 20.4.8
-      '@types/node-fetch':
-        specifier: ^2.6.4
-        version: 2.6.4
       dotenv:
         specifier: ^16.3.1
         version: 16.3.1
@@ -437,7 +416,7 @@ packages:
      digest-fetch: 1.3.0
      form-data-encoder: 1.7.2
      formdata-node: 4.4.1
-      node-fetch: 2.6.12(encoding@0.1.13)
+      node-fetch: 2.6.12
    transitivePeerDependencies:
      - encoding
    dev: false
@@ -2711,7 +2690,7 @@ packages:
    dependencies:
      https-proxy-agent: 5.0.1
      mkdirp: 0.5.6
-      node-fetch: 2.6.12(encoding@0.1.13)
+      node-fetch: 2.6.12
      progress: 2.0.3
      proxy-from-env: 1.1.0
      which: 2.0.2
@@ -2962,12 +2941,6 @@ packages:
    resolution: {integrity: sha512-+Wt0NFAeflVSNiUnHIDNN3C8jP7XIRmYrcgJ6IsAnm0lK4p/FkpCpeu1aig5qxrgZx30PHNDLZ/3FttVSEW2aQ==}
    dev: false

-  /@types/archiver@5.3.2:
-    resolution: {integrity: sha512-IctHreBuWE5dvBDz/0WeKtyVKVRs4h75IblxOACL92wU66v+HGAfEYAOyXkOFphvRJMhuXdI9huDXpX0FC6lCw==}
-    dependencies:
-      '@types/readdir-glob': 1.1.1
-    dev: true
-
  /@types/babel__core@7.20.1:
    resolution: {integrity: sha512-aACu/U/omhdk15O4Nfb+fHgH/z3QsfQzpnvRZhYhThms83ZnAOZz7zZAWO7mn2yyNQaA4xTO8GLK3uqFU4bYYw==}
    dependencies:
@@ -3207,6 +3180,7 @@ packages:
    dependencies:
      '@types/node': 20.4.10
      form-data: 3.0.1
+    dev: false

  /@types/node@18.16.0:
    resolution: {integrity: sha512-BsAaKhB+7X+H4GnSjGhJG9Qi8Tw+inU9nJDwmD5CgOmBLEI6ArdhikpLX7DjbjDRDTbqZzU2LSQNZg8WGPiSZQ==}
@@ -3283,12 +3257,6 @@ packages:
      '@types/scheduler': 0.16.3
      csstype: 3.1.2

-  /@types/readdir-glob@1.1.1:
-    resolution: {integrity: sha512-ImM6TmoF8bgOwvehGviEj3tRdRBbQujr1N+0ypaln/GWjaerOB26jb93vsRHmdMtvVQZQebOlqt2HROark87mQ==}
-    dependencies:
-      '@types/node': 20.4.10
-    dev: true
-
  /@types/request@2.48.8:
    resolution: {integrity: sha512-whjk1EDJPcAR2kYHRbFl/lKeeKYTi05A15K9bnLInCVroNDCtXce57xKdI0/rQaA3K+6q0eFyUBPmqfSndUZdQ==}
    dependencies:
@@ -3320,12 +3288,6 @@ packages:
      '@types/node': 20.4.10
    dev: true

-  /@types/stream-buffers@3.0.4:
-    resolution: {integrity: sha512-qU/K1tb2yUdhXkLIATzsIPwbtX6BpZk0l3dPW6xqWyhfzzM1ECaQ/8faEnu3CNraLiQ9LHyQQPBGp7N9Fbs25w==}
-    dependencies:
-      '@types/node': 20.4.10
-    dev: true
-
  /@types/tough-cookie@4.0.2:
    resolution: {integrity: sha512-Q5vtl1W5ue16D+nIaW8JWebSSraJVlK+EthKn7e7UcD4KWsaSJ8BqGPXNaPghgtcn/fhvrN17Tv8ksUsQpiplw==}
    dev: false
@@ -3729,51 +3691,6 @@ packages:
      picomatch: 2.3.1
    dev: false

-  /archiver-utils@2.1.0:
-    resolution: {integrity: sha512-bEL/yUb/fNNiNTuUz979Z0Yg5L+LzLxGJz8x79lYmR54fmTIb6ob/hNQgkQnIUDWIFjZVQwl9Xs356I6BAMHfw==}
-    engines: {node: '>= 6'}
-    dependencies:
-      glob: 7.2.3
-      graceful-fs: 4.2.11
-      lazystream: 1.0.1
-      lodash.defaults: 4.2.0
-      lodash.difference: 4.5.0
-      lodash.flatten: 4.4.0
-      lodash.isplainobject: 4.0.6
-      lodash.union: 4.6.0
-      normalize-path: 3.0.0
-      readable-stream: 2.3.8
-    dev: false
-
-  /archiver-utils@3.0.3:
-    resolution: {integrity: sha512-fXzpEZTKgBJMWy0eUT0/332CAQnJ27OJd7sGcvNZzxS2Yzg7iITivMhXOm+zUTO4vT8ZqlPCqiaLPmB8qWhWRA==}
-    engines: {node: '>= 10'}
-    dependencies:
-      glob: 7.2.3
-      graceful-fs: 4.2.11
-      lazystream: 1.0.1
-      lodash.defaults: 4.2.0
-      lodash.difference: 4.5.0
-      lodash.flatten: 4.4.0
-      lodash.isplainobject: 4.0.6
-      lodash.union: 4.6.0
-      normalize-path: 3.0.0
-      readable-stream: 3.6.2
-    dev: false
-
-  /archiver@6.0.0:
-    resolution: {integrity: sha512-EPGa+bYaxaMiCT8DCbEDqFz8IjeBSExrJzyUOJx2FBkFJ/OZzJuso3lMSk901M50gMqXxTQcumlGajOFlXhVhw==}
-    engines: {node: '>= 12.0.0'}
-    dependencies:
-      archiver-utils: 3.0.3
-      async: 3.2.4
-      buffer-crc32: 0.2.13
-      readable-stream: 3.6.2
-      readdir-glob: 1.1.3
-      tar-stream: 2.2.0
-      zip-stream: 4.1.0
-    dev: false
-
  /argparse@2.0.1:
    resolution: {integrity: sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q==}

@@ -3912,12 +3829,9 @@ packages:
      tslib: 2.6.1
    dev: false

-  /async@3.2.4:
-    resolution: {integrity: sha512-iAB+JbDEGXhyIUavoDl9WP/Jj106Kz9DEn1DPgYw5ruDn0e3Wgi3sKFm55sASdGBNOQB8F59d9qQ7deqrHA8wQ==}
-    dev: false
-
  /asynckit@0.4.0:
    resolution: {integrity: sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==}
+    dev: false

  /available-typed-arrays@1.0.5:
    resolution: {integrity: sha512-DMD0KiN46eipeziST1LPP/STfDU0sufISXmjSgvVsoU2tqxctQeASejWcfNtxYKqETM1UxQ8sp2OrSBWpHY6sw==}
@@ -4035,14 +3949,6 @@ packages:
    engines: {node: '>=8'}
    dev: false

-  /bl@4.1.0:
-    resolution: {integrity: sha512-1W07cM9gS6DcLperZfFSj+bWLtaPGSOHWhPiGzXmvVJbRLdG82sH/Kn8EtW1VqWVA54AKf2h5k5BbnIbwF3h6w==}
-    dependencies:
-      buffer: 5.7.1
-      inherits: 2.0.4
-      readable-stream: 3.6.2
-    dev: false
-
  /bluebird@3.7.2:
    resolution: {integrity: sha512-XpNj6GDQzdfW+r2Wnn7xiSAd7TM3jzkxGXBGTtWKuSXv1xUV+azxAm8jdWZN06QTQk+2N2XB9jRDkvbmQmcRtg==}
|
||||||
dev: false
|
dev: false
|
||||||
@@ -4095,10 +4001,6 @@ packages:
|
|||||||
node-releases: 2.0.13
|
node-releases: 2.0.13
|
||||||
update-browserslist-db: 1.0.11(browserslist@4.21.10)
|
update-browserslist-db: 1.0.11(browserslist@4.21.10)
|
||||||
|
|
||||||
/buffer-crc32@0.2.13:
|
|
||||||
resolution: {integrity: sha512-VO9Ht/+p3SN7SKWqcrgEzjGbRSJYTx+Q1pTQC0wrWqHx0vpJraQ6GtHx8tvcg1rlK1byhU5gccxgOgj7B0TDkQ==}
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/buffer-from@0.1.2:
|
/buffer-from@0.1.2:
|
||||||
resolution: {integrity: sha512-RiWIenusJsmI2KcvqQABB83tLxCByE3upSP8QU3rJDMVFGPWLvPQJt/O1Su9moRWeH7d+Q2HYb68f6+v+tw2vg==}
|
resolution: {integrity: sha512-RiWIenusJsmI2KcvqQABB83tLxCByE3upSP8QU3rJDMVFGPWLvPQJt/O1Su9moRWeH7d+Q2HYb68f6+v+tw2vg==}
|
||||||
dev: false
|
dev: false
|
||||||
@@ -4111,13 +4013,6 @@ packages:
|
|||||||
engines: {node: '>=4'}
|
engines: {node: '>=4'}
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
/buffer@5.7.1:
|
|
||||||
resolution: {integrity: sha512-EHcyIPBQ4BSGlvjB16k5KgAJ27CIsHY/2JBmCRReo48y9rQ3MaUzWX3KVlBa4U7MyX02HdVj0K7C3WaB3ju7FQ==}
|
|
||||||
dependencies:
|
|
||||||
base64-js: 1.5.1
|
|
||||||
ieee754: 1.2.1
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/busboy@1.6.0:
|
/busboy@1.6.0:
|
||||||
resolution: {integrity: sha512-8SFQbg/0hQ9xy3UNTB0YEnsNBbWfhf7RtnzpL7TkBiTBRfrQ9Fxcnz7VJsleJpyp6rVLvXiuORqjlHi5q+PYuA==}
|
resolution: {integrity: sha512-8SFQbg/0hQ9xy3UNTB0YEnsNBbWfhf7RtnzpL7TkBiTBRfrQ9Fxcnz7VJsleJpyp6rVLvXiuORqjlHi5q+PYuA==}
|
||||||
engines: {node: '>=10.16.0'}
|
engines: {node: '>=10.16.0'}
|
||||||
@@ -4327,6 +4222,7 @@ packages:
|
|||||||
engines: {node: '>= 0.8'}
|
engines: {node: '>= 0.8'}
|
||||||
dependencies:
|
dependencies:
|
||||||
delayed-stream: 1.0.0
|
delayed-stream: 1.0.0
|
||||||
|
dev: false
|
||||||
|
|
||||||
/comma-separated-tokens@1.0.8:
|
/comma-separated-tokens@1.0.8:
|
||||||
resolution: {integrity: sha512-GHuDRO12Sypu2cV70d1dkA2EUmXHgntrzbpvOB+Qy+49ypNfGgFQIC2fhhXbnyrJRynDCAARsT7Ou0M6hirpfw==}
|
resolution: {integrity: sha512-GHuDRO12Sypu2cV70d1dkA2EUmXHgntrzbpvOB+Qy+49ypNfGgFQIC2fhhXbnyrJRynDCAARsT7Ou0M6hirpfw==}
|
||||||
@@ -4344,16 +4240,6 @@ packages:
|
|||||||
resolution: {integrity: sha512-W9pAhw0ja1Edb5GVdIF1mjZw/ASI0AlShXM83UUGe2DVr5TdAPEA1OA8m/g8zWp9x6On7gqufY+FatDbC3MDQg==}
|
resolution: {integrity: sha512-W9pAhw0ja1Edb5GVdIF1mjZw/ASI0AlShXM83UUGe2DVr5TdAPEA1OA8m/g8zWp9x6On7gqufY+FatDbC3MDQg==}
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
/compress-commons@4.1.1:
|
|
||||||
resolution: {integrity: sha512-QLdDLCKNV2dtoTorqgxngQCMA+gWXkM/Nwu7FpeBhk/RdkzimqC3jueb/FDmaZeXh+uby1jkBqE3xArsLBE5wQ==}
|
|
||||||
engines: {node: '>= 10'}
|
|
||||||
dependencies:
|
|
||||||
buffer-crc32: 0.2.13
|
|
||||||
crc32-stream: 4.0.2
|
|
||||||
normalize-path: 3.0.0
|
|
||||||
readable-stream: 3.6.2
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/compute-scroll-into-view@1.0.20:
|
/compute-scroll-into-view@1.0.20:
|
||||||
resolution: {integrity: sha512-UCB0ioiyj8CRjtrvaceBLqqhZCVP+1B8+NWQhmdsm0VXOJtobBCf1dBQmebCCo34qZmUwZfIH2MZLqNHazrfjg==}
|
resolution: {integrity: sha512-UCB0ioiyj8CRjtrvaceBLqqhZCVP+1B8+NWQhmdsm0VXOJtobBCf1dBQmebCCo34qZmUwZfIH2MZLqNHazrfjg==}
|
||||||
dev: false
|
dev: false
|
||||||
@@ -4461,20 +4347,6 @@ packages:
|
|||||||
yaml: 1.10.2
|
yaml: 1.10.2
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
/crc-32@1.2.2:
|
|
||||||
resolution: {integrity: sha512-ROmzCKrTnOwybPcJApAA6WBWij23HVfGVNKqqrZpuyZOHqK2CwHSvpGuyt/UNNvaIjEd8X5IFGp4Mh+Ie1IHJQ==}
|
|
||||||
engines: {node: '>=0.8'}
|
|
||||||
hasBin: true
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/crc32-stream@4.0.2:
|
|
||||||
resolution: {integrity: sha512-DxFZ/Hk473b/muq1VJ///PMNLj0ZMnzye9thBpmjpJKCc5eMgB95aK8zCGrGfQ90cWo561Te6HK9D+j4KPdM6w==}
|
|
||||||
engines: {node: '>= 10'}
|
|
||||||
dependencies:
|
|
||||||
crc-32: 1.2.2
|
|
||||||
readable-stream: 3.6.2
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/create-emotion@10.0.27:
|
/create-emotion@10.0.27:
|
||||||
resolution: {integrity: sha512-fIK73w82HPPn/RsAij7+Zt8eCE8SptcJ3WoRMfxMtjteYxud8GDTKKld7MYwAX2TVhrw29uR1N/bVGxeStHILg==}
|
resolution: {integrity: sha512-fIK73w82HPPn/RsAij7+Zt8eCE8SptcJ3WoRMfxMtjteYxud8GDTKKld7MYwAX2TVhrw29uR1N/bVGxeStHILg==}
|
||||||
dependencies:
|
dependencies:
|
||||||
@@ -4635,6 +4507,11 @@ packages:
|
|||||||
assert-plus: 1.0.0
|
assert-plus: 1.0.0
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
|
/data-uri-to-buffer@4.0.1:
|
||||||
|
resolution: {integrity: sha512-0R9ikRb668HB7QDxT1vkpuUBtqc53YyAwMwGeUFKRojY/NWKvdZ+9UYtRfGmhqNbRkTSVpMbmyhXipFFv2cb/A==}
|
||||||
|
engines: {node: '>= 12'}
|
||||||
|
dev: false
|
||||||
|
|
||||||
/date-fns@2.30.0:
|
/date-fns@2.30.0:
|
||||||
resolution: {integrity: sha512-fnULvOpxnC5/Vg3NCiWelDsLiUc9bRwAPs/+LfTLNvetFCtCTN+yQz15C/fs4AwX1R9K5GLtLfn8QW+dWisaAw==}
|
resolution: {integrity: sha512-fnULvOpxnC5/Vg3NCiWelDsLiUc9bRwAPs/+LfTLNvetFCtCTN+yQz15C/fs4AwX1R9K5GLtLfn8QW+dWisaAw==}
|
||||||
engines: {node: '>=0.11'}
|
engines: {node: '>=0.11'}
|
||||||
@@ -4718,6 +4595,7 @@ packages:
|
|||||||
/delayed-stream@1.0.0:
|
/delayed-stream@1.0.0:
|
||||||
resolution: {integrity: sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==}
|
resolution: {integrity: sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==}
|
||||||
engines: {node: '>=0.4.0'}
|
engines: {node: '>=0.4.0'}
|
||||||
|
dev: false
|
||||||
|
|
||||||
/depd@1.1.2:
|
/depd@1.1.2:
|
||||||
resolution: {integrity: sha512-7emPTl6Dpo6JRXOXjLRxck+FlLRX5847cLKEn00PLAgc3g2hTZZgr+e4c2v6QpSmLeFP3n5yUo7ft6avBK/5jQ==}
|
resolution: {integrity: sha512-7emPTl6Dpo6JRXOXjLRxck+FlLRX5847cLKEn00PLAgc3g2hTZZgr+e4c2v6QpSmLeFP3n5yUo7ft6avBK/5jQ==}
|
||||||
@@ -4851,18 +4729,6 @@ packages:
|
|||||||
engines: {node: '>= 0.8'}
|
engines: {node: '>= 0.8'}
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
/encoding@0.1.13:
|
|
||||||
resolution: {integrity: sha512-ETBauow1T35Y/WZMkio9jiM0Z5xjHHmJ4XmjZOq1l/dXz3lr2sRn87nJy20RupqSh1F2m3HHPSp8ShIPQJrJ3A==}
|
|
||||||
dependencies:
|
|
||||||
iconv-lite: 0.6.3
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/end-of-stream@1.4.4:
|
|
||||||
resolution: {integrity: sha512-+uw1inIHVPQoaVuHzRyXd21icM+cnt4CzD5rW+NC1wjOUSTOs+Te7FOv7AhN7vS9x/oIyhLP5PR1H+phQAHu5Q==}
|
|
||||||
dependencies:
|
|
||||||
once: 1.4.0
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/engine.io-client@6.5.2:
|
/engine.io-client@6.5.2:
|
||||||
resolution: {integrity: sha512-CQZqbrpEYnrpGqC07a9dJDz4gePZUgTPMU3NKJPSeQOyw27Tst4Pl3FemKoFGAlHzgZmKjoRmiJvbWfhCXUlIg==}
|
resolution: {integrity: sha512-CQZqbrpEYnrpGqC07a9dJDz4gePZUgTPMU3NKJPSeQOyw27Tst4Pl3FemKoFGAlHzgZmKjoRmiJvbWfhCXUlIg==}
|
||||||
dependencies:
|
dependencies:
|
||||||
@@ -5533,6 +5399,14 @@ packages:
|
|||||||
format: 0.2.2
|
format: 0.2.2
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
|
/fetch-blob@3.2.0:
|
||||||
|
resolution: {integrity: sha512-7yAQpD2UMJzLi1Dqv7qFYnPbaPx7ZfFK6PiIxQ4PfkGPyNyl2Ugx+a/umUonmKqjhM4DnfbMvdX6otXq83soQQ==}
|
||||||
|
engines: {node: ^12.20 || >= 14.13}
|
||||||
|
dependencies:
|
||||||
|
node-domexception: 1.0.0
|
||||||
|
web-streams-polyfill: 3.2.1
|
||||||
|
dev: false
|
||||||
|
|
||||||
/fflate@0.4.8:
|
/fflate@0.4.8:
|
||||||
resolution: {integrity: sha512-FJqqoDBR00Mdj9ppamLa/Y7vxm+PRmNWA67N846RvsoYVMKB4q3y/de5PA7gUmRMYK/8CMz2GDZQmCRN1wBcWA==}
|
resolution: {integrity: sha512-FJqqoDBR00Mdj9ppamLa/Y7vxm+PRmNWA67N846RvsoYVMKB4q3y/de5PA7gUmRMYK/8CMz2GDZQmCRN1wBcWA==}
|
||||||
dev: false
|
dev: false
|
||||||
@@ -5648,6 +5522,7 @@ packages:
|
|||||||
asynckit: 0.4.0
|
asynckit: 0.4.0
|
||||||
combined-stream: 1.0.8
|
combined-stream: 1.0.8
|
||||||
mime-types: 2.1.35
|
mime-types: 2.1.35
|
||||||
|
dev: false
|
||||||
|
|
||||||
/form-data@4.0.0:
|
/form-data@4.0.0:
|
||||||
resolution: {integrity: sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==}
|
resolution: {integrity: sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==}
|
||||||
@@ -5671,6 +5546,13 @@ packages:
|
|||||||
web-streams-polyfill: 4.0.0-beta.3
|
web-streams-polyfill: 4.0.0-beta.3
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
|
/formdata-polyfill@4.0.10:
|
||||||
|
resolution: {integrity: sha512-buewHzMvYL29jdeQTVILecSaZKnt/RJWjoZCF5OW60Z67/GmSLBkOFM7qh1PI3zFNtJbaZL5eQu1vLfazOwj4g==}
|
||||||
|
engines: {node: '>=12.20.0'}
|
||||||
|
dependencies:
|
||||||
|
fetch-blob: 3.2.0
|
||||||
|
dev: false
|
||||||
|
|
||||||
/forwarded@0.2.0:
|
/forwarded@0.2.0:
|
||||||
resolution: {integrity: sha512-buRG0fpBtRHSTCOASe6hD258tEubFoRLb4ZNA6NxMVHNw2gOcwHo9wyablzMzOA5z9xA9L1KNjk/Nt6MT9aYow==}
|
resolution: {integrity: sha512-buRG0fpBtRHSTCOASe6hD258tEubFoRLb4ZNA6NxMVHNw2gOcwHo9wyablzMzOA5z9xA9L1KNjk/Nt6MT9aYow==}
|
||||||
engines: {node: '>= 0.6'}
|
engines: {node: '>= 0.6'}
|
||||||
@@ -5705,10 +5587,6 @@ packages:
|
|||||||
engines: {node: '>= 0.6'}
|
engines: {node: '>= 0.6'}
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
/fs-constants@1.0.0:
|
|
||||||
resolution: {integrity: sha512-y6OAwoSIf7FyjMIv94u+b5rdheZEjzR63GTyZJm5qh4Bi+2YgwLCcI/fPFZkL5PSixOt6ZNKm+w+Hfp/Bciwow==}
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/fs-extra@11.1.1:
|
/fs-extra@11.1.1:
|
||||||
resolution: {integrity: sha512-MGIE4HOvQCeUCzmlHs0vXpih4ysz4wg9qiSAu6cd42lVwPbTM1TjV7RusoyQqMmk/95gdQZX72u+YW+c3eEpFQ==}
|
resolution: {integrity: sha512-MGIE4HOvQCeUCzmlHs0vXpih4ysz4wg9qiSAu6cd42lVwPbTM1TjV7RusoyQqMmk/95gdQZX72u+YW+c3eEpFQ==}
|
||||||
engines: {node: '>=14.14'}
|
engines: {node: '>=14.14'}
|
||||||
@@ -6077,10 +5955,6 @@ packages:
|
|||||||
- supports-color
|
- supports-color
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
/human-id@4.0.0:
|
|
||||||
resolution: {integrity: sha512-pui0xZRgeAlaRt0I9r8N2pNlbNmluvn71EfjKRpM7jOpZbuHe5mm76r67gcprjw/Nd+GpvB9C3OlTbh7ZKLg7A==}
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/humanize-ms@1.2.1:
|
/humanize-ms@1.2.1:
|
||||||
resolution: {integrity: sha512-Fl70vYtsAFb/C06PTS9dZBo7ihau+Tu/DNCk/OyHhea07S+aeMWpFFkUaXRa8fI+ScZbEI8dfSxwY7gxZ9SAVQ==}
|
resolution: {integrity: sha512-Fl70vYtsAFb/C06PTS9dZBo7ihau+Tu/DNCk/OyHhea07S+aeMWpFFkUaXRa8fI+ScZbEI8dfSxwY7gxZ9SAVQ==}
|
||||||
dependencies:
|
dependencies:
|
||||||
@@ -6094,17 +5968,6 @@ packages:
|
|||||||
safer-buffer: 2.1.2
|
safer-buffer: 2.1.2
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
/iconv-lite@0.6.3:
|
|
||||||
resolution: {integrity: sha512-4fCk79wshMdzMp2rH06qWrJE4iolqLhCUH+OiuIgU++RB0+94NlDL81atO7GX55uUKueo0txHNtvEyI6D7WdMw==}
|
|
||||||
engines: {node: '>=0.10.0'}
|
|
||||||
dependencies:
|
|
||||||
safer-buffer: 2.1.2
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/ieee754@1.2.1:
|
|
||||||
resolution: {integrity: sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA==}
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/ignore@5.2.4:
|
/ignore@5.2.4:
|
||||||
resolution: {integrity: sha512-MAb38BcSbH0eHNBxn7ql2NH/kX33OkB3lZ1BNdh7ENeRChHTYsTvWrMubiIAMNS2llXEEgZ1MUOBtXChP3kaFQ==}
|
resolution: {integrity: sha512-MAb38BcSbH0eHNBxn7ql2NH/kX33OkB3lZ1BNdh7ENeRChHTYsTvWrMubiIAMNS2llXEEgZ1MUOBtXChP3kaFQ==}
|
||||||
engines: {node: '>= 4'}
|
engines: {node: '>= 4'}
|
||||||
@@ -6396,7 +6259,7 @@ packages:
|
|||||||
resolution: {integrity: sha512-7vuh85V5cdDofPyxn58nrPjBktZo0u9x1g8WtjQol+jZDaE+fhN+cIvTj11GndBnMnyfrUOG1sZQxCdjKh+DKg==}
|
resolution: {integrity: sha512-7vuh85V5cdDofPyxn58nrPjBktZo0u9x1g8WtjQol+jZDaE+fhN+cIvTj11GndBnMnyfrUOG1sZQxCdjKh+DKg==}
|
||||||
engines: {node: '>= 10.13.0'}
|
engines: {node: '>= 10.13.0'}
|
||||||
dependencies:
|
dependencies:
|
||||||
'@types/node': 20.4.10
|
'@types/node': 18.16.0
|
||||||
merge-stream: 2.0.0
|
merge-stream: 2.0.0
|
||||||
supports-color: 8.1.1
|
supports-color: 8.1.1
|
||||||
|
|
||||||
@@ -6570,13 +6433,6 @@ packages:
|
|||||||
language-subtag-registry: 0.3.22
|
language-subtag-registry: 0.3.22
|
||||||
dev: true
|
dev: true
|
||||||
|
|
||||||
/lazystream@1.0.1:
|
|
||||||
resolution: {integrity: sha512-b94GiNHQNy6JNTrt5w6zNyffMrNkXZb3KTkCZJb2V1xaEGCk093vkZ2jk3tpaeP33/OiXC+WvK9AxUebnf5nbw==}
|
|
||||||
engines: {node: '>= 0.6.3'}
|
|
||||||
dependencies:
|
|
||||||
readable-stream: 2.3.8
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/levn@0.4.1:
|
/levn@0.4.1:
|
||||||
resolution: {integrity: sha512-+bT2uH4E5LGE7h/n3evcS/sQlJXCpIp6ym8OWJ5eV6+67Dsql/LaaT7qJBAt2rzfoa/5QBGBhxDix1dMt2kQKQ==}
|
resolution: {integrity: sha512-+bT2uH4E5LGE7h/n3evcS/sQlJXCpIp6ym8OWJ5eV6+67Dsql/LaaT7qJBAt2rzfoa/5QBGBhxDix1dMt2kQKQ==}
|
||||||
engines: {node: '>= 0.8.0'}
|
engines: {node: '>= 0.8.0'}
|
||||||
@@ -6645,22 +6501,6 @@ packages:
|
|||||||
resolution: {integrity: sha512-/u14pXGviLaweY5JI0IUzgzF2J6Ne8INyzAZjImcryjgkZ+ebruBxy2/JaOOkTqScddcYtakjhSaeemV8lR0tA==}
|
resolution: {integrity: sha512-/u14pXGviLaweY5JI0IUzgzF2J6Ne8INyzAZjImcryjgkZ+ebruBxy2/JaOOkTqScddcYtakjhSaeemV8lR0tA==}
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
/lodash.defaults@4.2.0:
|
|
||||||
resolution: {integrity: sha512-qjxPLHd3r5DnsdGacqOMU6pb/avJzdh9tFX2ymgoZE27BmjXrNy/y4LoaiTeAb+O3gL8AfpJGtqfX/ae2leYYQ==}
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/lodash.difference@4.5.0:
|
|
||||||
resolution: {integrity: sha512-dS2j+W26TQ7taQBGN8Lbbq04ssV3emRw4NY58WErlTO29pIqS0HmoT5aJ9+TUQ1N3G+JOZSji4eugsWwGp9yPA==}
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/lodash.flatten@4.4.0:
|
|
||||||
resolution: {integrity: sha512-C5N2Z3DgnnKr0LOpv/hKCgKdb7ZZwafIrsesve6lmzvZIRZRGaZ/l6Q8+2W7NaT+ZwO3fFlSCzCzrDCFdJfZ4g==}
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/lodash.isplainobject@4.0.6:
|
|
||||||
resolution: {integrity: sha512-oSXzaWypCMHkPC3NvBEaPHf0KsA5mvPrOPgQWDsbg8n7orZ290M0BmC/jgRZ4vcJ6DTAhjrsSYgdsW/F+MFOBA==}
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/lodash.merge@4.6.2:
|
/lodash.merge@4.6.2:
|
||||||
resolution: {integrity: sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ==}
|
resolution: {integrity: sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ==}
|
||||||
dev: true
|
dev: true
|
||||||
@@ -6669,10 +6509,6 @@ packages:
|
|||||||
resolution: {integrity: sha512-GK3g5RPZWTRSeLSpgP8Xhra+pnjBC56q9FZYe1d5RN3TJ35dbkGy3YqBSMbyCrlbi+CM9Z3Jk5yTL7RCsqboyQ==}
|
resolution: {integrity: sha512-GK3g5RPZWTRSeLSpgP8Xhra+pnjBC56q9FZYe1d5RN3TJ35dbkGy3YqBSMbyCrlbi+CM9Z3Jk5yTL7RCsqboyQ==}
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
/lodash.union@4.6.0:
|
|
||||||
resolution: {integrity: sha512-c4pB2CdGrGdjMKYLA+XiRDO7Y0PRQbm/Gzg8qMj+QH+pFVAoTp5sBpO0odL3FjoPCGjK96p6qsP+yQoiLoOBcw==}
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/lodash@4.17.21:
|
/lodash@4.17.21:
|
||||||
resolution: {integrity: sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==}
|
resolution: {integrity: sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==}
|
||||||
dev: false
|
dev: false
|
||||||
@@ -7023,7 +6859,7 @@ packages:
|
|||||||
engines: {node: '>=10.5.0'}
|
engines: {node: '>=10.5.0'}
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
/node-fetch@2.6.12(encoding@0.1.13):
|
/node-fetch@2.6.12:
|
||||||
resolution: {integrity: sha512-C/fGU2E8ToujUivIO0H+tpQ6HWo4eEmchoPIoXtxCrVghxdKq+QOHqEZW7tuP3KlV3bC8FRMO5nMCC7Zm1VP6g==}
|
resolution: {integrity: sha512-C/fGU2E8ToujUivIO0H+tpQ6HWo4eEmchoPIoXtxCrVghxdKq+QOHqEZW7tuP3KlV3bC8FRMO5nMCC7Zm1VP6g==}
|
||||||
engines: {node: 4.x || >=6.0.0}
|
engines: {node: 4.x || >=6.0.0}
|
||||||
peerDependencies:
|
peerDependencies:
|
||||||
@@ -7032,10 +6868,18 @@ packages:
|
|||||||
encoding:
|
encoding:
|
||||||
optional: true
|
optional: true
|
||||||
dependencies:
|
dependencies:
|
||||||
encoding: 0.1.13
|
|
||||||
whatwg-url: 5.0.0
|
whatwg-url: 5.0.0
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
|
/node-fetch@3.3.2:
|
||||||
|
resolution: {integrity: sha512-dRB78srN/l6gqWulah9SrxeYnxeddIG30+GOqK/9OlLVyLg3HPnr6SqOWTWOXKRwC2eGYCkZ59NNuSgvSrpgOA==}
|
||||||
|
engines: {node: ^12.20.0 || ^14.13.1 || >=16.0.0}
|
||||||
|
dependencies:
|
||||||
|
data-uri-to-buffer: 4.0.1
|
||||||
|
fetch-blob: 3.2.0
|
||||||
|
formdata-polyfill: 4.0.10
|
||||||
|
dev: false
|
||||||
|
|
||||||
/node-mocks-http@1.12.2:
|
/node-mocks-http@1.12.2:
|
||||||
resolution: {integrity: sha512-xhWwC0dh35R9rf0j3bRZXuISXdHxxtMx0ywZQBwjrg3yl7KpRETzogfeCamUIjltpn0Fxvs/ZhGJul1vPLrdJQ==}
|
resolution: {integrity: sha512-xhWwC0dh35R9rf0j3bRZXuISXdHxxtMx0ywZQBwjrg3yl7KpRETzogfeCamUIjltpn0Fxvs/ZhGJul1vPLrdJQ==}
|
||||||
engines: {node: '>=0.6'}
|
engines: {node: '>=0.6'}
|
||||||
@@ -7183,7 +7027,7 @@ packages:
|
|||||||
- debug
|
- debug
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
/openai@4.0.0-beta.7(encoding@0.1.13):
|
/openai@4.0.0-beta.7:
|
||||||
resolution: {integrity: sha512-jHjwvpMuGkNxiQ3erwLZsOvPEhcVrMtwtfNeYmGCjhbdB+oStVw/7pIhIPkualu8rlhLwgMR7awknIaN3IQcOA==}
|
resolution: {integrity: sha512-jHjwvpMuGkNxiQ3erwLZsOvPEhcVrMtwtfNeYmGCjhbdB+oStVw/7pIhIPkualu8rlhLwgMR7awknIaN3IQcOA==}
|
||||||
dependencies:
|
dependencies:
|
||||||
'@types/node': 18.16.0
|
'@types/node': 18.16.0
|
||||||
@@ -7193,7 +7037,7 @@ packages:
|
|||||||
digest-fetch: 1.3.0
|
digest-fetch: 1.3.0
|
||||||
form-data-encoder: 1.7.2
|
form-data-encoder: 1.7.2
|
||||||
formdata-node: 4.4.1
|
formdata-node: 4.4.1
|
||||||
node-fetch: 2.6.12(encoding@0.1.13)
|
node-fetch: 2.6.12
|
||||||
transitivePeerDependencies:
|
transitivePeerDependencies:
|
||||||
- encoding
|
- encoding
|
||||||
dev: false
|
dev: false
|
||||||
@@ -8018,21 +7862,6 @@ packages:
|
|||||||
util-deprecate: 1.0.2
|
util-deprecate: 1.0.2
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
/readable-stream@3.6.2:
|
|
||||||
resolution: {integrity: sha512-9u/sniCrY3D5WdsERHzHE4G2YCXqoG5FTHUiCC4SIbr6XcLZBY05ya9EKjYek9O5xOAwjGq+1JdGBAS7Q9ScoA==}
|
|
||||||
engines: {node: '>= 6'}
|
|
||||||
dependencies:
|
|
||||||
inherits: 2.0.4
|
|
||||||
string_decoder: 1.1.1
|
|
||||||
util-deprecate: 1.0.2
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/readdir-glob@1.1.3:
|
|
||||||
resolution: {integrity: sha512-v05I2k7xN8zXvPD9N+z/uhXPaj0sUFCe2rcWZIpBsqxfP7xXFQ0tipAd/wjj1YxWyWtUS5IDJpOG82JKt2EAVA==}
|
|
||||||
dependencies:
|
|
||||||
minimatch: 5.1.6
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/readdirp@3.6.0:
|
/readdirp@3.6.0:
|
||||||
resolution: {integrity: sha512-hOS089on8RduqdbhvQ5Z37A0ESjsqz6qnRcffsMU3495FuTdqSm+7bhJ29JvIOsBDEEnan5DPu9t3To9VRlMzA==}
|
resolution: {integrity: sha512-hOS089on8RduqdbhvQ5Z37A0ESjsqz6qnRcffsMU3495FuTdqSm+7bhJ29JvIOsBDEEnan5DPu9t3To9VRlMzA==}
|
||||||
engines: {node: '>=8.10.0'}
|
engines: {node: '>=8.10.0'}
|
||||||
@@ -8488,11 +8317,6 @@ packages:
|
|||||||
resolution: {integrity: sha512-Rz6yejtVyWnVjC1RFvNmYL10kgjC49EOghxWn0RFqlCHGFpQx+Xe7yW3I4ceK1SGrWIGMjD5Kbue8W/udkbMJg==}
|
resolution: {integrity: sha512-Rz6yejtVyWnVjC1RFvNmYL10kgjC49EOghxWn0RFqlCHGFpQx+Xe7yW3I4ceK1SGrWIGMjD5Kbue8W/udkbMJg==}
|
||||||
dev: true
|
dev: true
|
||||||
|
|
||||||
/stream-buffers@3.0.2:
|
|
||||||
resolution: {integrity: sha512-DQi1h8VEBA/lURbSwFtEHnSTb9s2/pwLEaFuNhXwy1Dx3Sa0lOuYT2yNUr4/j2fs8oCAMANtrZ5OrPZtyVs3MQ==}
|
|
||||||
engines: {node: '>= 0.10.0'}
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/streamsearch@1.1.0:
|
/streamsearch@1.1.0:
|
||||||
resolution: {integrity: sha512-Mcc5wHehp9aXz1ax6bZUyY5afg9u2rv5cqQI3mRrYkGC8rW2hM02jWuwjtL++LS5qinSyhj2QfLyNsuc+VsExg==}
|
resolution: {integrity: sha512-Mcc5wHehp9aXz1ax6bZUyY5afg9u2rv5cqQI3mRrYkGC8rW2hM02jWuwjtL++LS5qinSyhj2QfLyNsuc+VsExg==}
|
||||||
engines: {node: '>=10.0.0'}
|
engines: {node: '>=10.0.0'}
|
||||||
@@ -8640,17 +8464,6 @@ packages:
|
|||||||
resolution: {integrity: sha512-GNzQvQTOIP6RyTfE2Qxb8ZVlNmw0n88vp1szwWRimP02mnTsx3Wtn5qRdqY9w2XduFNUgvOwhNnQsjwCp+kqaQ==}
|
resolution: {integrity: sha512-GNzQvQTOIP6RyTfE2Qxb8ZVlNmw0n88vp1szwWRimP02mnTsx3Wtn5qRdqY9w2XduFNUgvOwhNnQsjwCp+kqaQ==}
|
||||||
engines: {node: '>=6'}
|
engines: {node: '>=6'}
|
||||||
|
|
||||||
/tar-stream@2.2.0:
|
|
||||||
resolution: {integrity: sha512-ujeqbceABgwMZxEJnk2HDY2DlnUZ+9oEcb1KzTVfYHio0UE6dG71n60d8D2I4qNvleWrrXpmjpt7vZeF1LnMZQ==}
|
|
||||||
engines: {node: '>=6'}
|
|
||||||
dependencies:
|
|
||||||
bl: 4.1.0
|
|
||||||
end-of-stream: 1.4.4
|
|
||||||
fs-constants: 1.0.0
|
|
||||||
inherits: 2.0.4
|
|
||||||
readable-stream: 3.6.2
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/terser-webpack-plugin@5.3.9(webpack@5.88.2):
|
/terser-webpack-plugin@5.3.9(webpack@5.88.2):
|
||||||
resolution: {integrity: sha512-ZuXsqE07EcggTWQjXUj+Aot/OMcD0bMKGgF63f7UxYcu5/AJF53aIpK1YoP5xR9l6s/Hy2b+t1AM0bLNPRuhwA==}
|
resolution: {integrity: sha512-ZuXsqE07EcggTWQjXUj+Aot/OMcD0bMKGgF63f7UxYcu5/AJF53aIpK1YoP5xR9l6s/Hy2b+t1AM0bLNPRuhwA==}
|
||||||
engines: {node: '>= 10.13.0'}
|
engines: {node: '>= 10.13.0'}
|
||||||
@@ -9324,6 +9137,11 @@ packages:
|
|||||||
glob-to-regexp: 0.4.1
|
glob-to-regexp: 0.4.1
|
||||||
graceful-fs: 4.2.11
|
graceful-fs: 4.2.11
|
||||||
|
|
||||||
|
/web-streams-polyfill@3.2.1:
|
||||||
|
resolution: {integrity: sha512-e0MO3wdXWKrLbL0DgGnUV7WHVuw9OUvL4hjgnPkIeEvESk74gAITi5G606JtZPp39cd8HA9VQzCIvA49LpPN5Q==}
|
||||||
|
engines: {node: '>= 8'}
|
||||||
|
dev: false
|
||||||
|
|
||||||
/web-streams-polyfill@4.0.0-beta.3:
|
/web-streams-polyfill@4.0.0-beta.3:
|
||||||
resolution: {integrity: sha512-QW95TCTaHmsYfHDybGMwO5IJIM93I/6vTRk+daHTWFPhwh+C8Cg7j7XyKrwrj8Ib6vYXe0ocYNrmzY4xAAN6ug==}
|
resolution: {integrity: sha512-QW95TCTaHmsYfHDybGMwO5IJIM93I/6vTRk+daHTWFPhwh+C8Cg7j7XyKrwrj8Ib6vYXe0ocYNrmzY4xAAN6ug==}
|
||||||
engines: {node: '>= 14'}
|
engines: {node: '>= 14'}
|
||||||
@@ -9535,15 +9353,6 @@ packages:
|
|||||||
resolution: {integrity: sha512-N+d4UJSJbt/R3wqY7Coqs5pcV0aUj2j9IaQ3rNj9bVCLld8tTGKRa2USARjnvZJWVx1NDmQev8EknoczaOQDOA==}
|
resolution: {integrity: sha512-N+d4UJSJbt/R3wqY7Coqs5pcV0aUj2j9IaQ3rNj9bVCLld8tTGKRa2USARjnvZJWVx1NDmQev8EknoczaOQDOA==}
|
||||||
dev: false
|
dev: false
|
||||||
|
|
||||||
/zip-stream@4.1.0:
|
|
||||||
resolution: {integrity: sha512-zshzwQW7gG7hjpBlgeQP9RuyPGNxvJdzR8SUM3QhxCnLjWN2E7j3dOvpeDcQoETfHx0urRS7EtmVToql7YpU4A==}
|
|
||||||
engines: {node: '>= 10'}
|
|
||||||
dependencies:
|
|
||||||
archiver-utils: 2.1.0
|
|
||||||
compress-commons: 4.1.1
|
|
||||||
readable-stream: 3.6.2
|
|
||||||
dev: false
|
|
||||||
|
|
||||||
/zod-to-json-schema@3.21.4(zod@3.21.4):
|
/zod-to-json-schema@3.21.4(zod@3.21.4):
|
||||||
resolution: {integrity: sha512-fjUZh4nQ1s6HMccgIeE0VP4QG/YRGPmyjO9sAh890aQKPEk3nqbfUXhMFaC+Dr5KvYBm8BCyvfpZf2jY9aGSsw==}
|
resolution: {integrity: sha512-fjUZh4nQ1s6HMccgIeE0VP4QG/YRGPmyjO9sAh890aQKPEk3nqbfUXhMFaC+Dr5KvYBm8BCyvfpZf2jY9aGSsw==}
|
||||||
peerDependencies:
|
peerDependencies:
|
||||||
|
|||||||
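The lockfile hunks above drop the archiver/zip-stream dependency tree, remove the optional `encoding` peer from `node-fetch@2.6.12`, and add `node-fetch@3.3.2` together with its `data-uri-to-buffer`, `fetch-blob`, and `formdata-polyfill` entries. For orientation only, here is a minimal TypeScript sketch of how node-fetch v3 is typically consumed; it is not code from this repository, and the upload URL is a placeholder.

```ts
// Hypothetical usage sketch for the node-fetch@3 entry added above (assumption, not repo code).
// node-fetch v3 is ESM-only; the v2 entry that remains in the lockfile is CommonJS-compatible.
import fetch, { Blob, FormData } from "node-fetch"; // FormData comes from formdata-polyfill, Blob from fetch-blob

async function upload(): Promise<void> {
  const form = new FormData();
  form.set("file", new Blob(["hello"]), "hello.txt");

  // example.com is a placeholder endpoint
  const res = await fetch("https://example.com/upload", { method: "POST", body: form });
  console.log(res.status, await res.text());
}

upload().catch(console.error);
```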
26 render.yaml
@@ -2,23 +2,27 @@ databases:
   - name: querykey-prod
     databaseName: querykey_prod
     user: querykey
-    plan: standard
+    plan: starter

 services:
   - type: web
     name: querykey-prod-web
-    runtime: docker
+    env: docker
     dockerfilePath: ./app/Dockerfile
     dockerContext: .
-    plan: pro
+    plan: standard
     domains:
       - app.openpipe.ai
     envVars:
+      - key: NODE_ENV
+        value: production
       - key: DATABASE_URL
         fromDatabase:
           name: querykey-prod
           property: connectionString
       - fromGroup: querykey-prod
+      - key: NEXT_PUBLIC_SOCKET_URL
+        value: https://querykey-prod-wss.onrender.com
       # Render support says we need to manually set this because otherwise
       # sometimes it checks a different random port that NextJS opens for
       # liveness and the liveness check fails.
@@ -27,22 +31,8 @@ services:

   - type: web
     name: querykey-prod-wss
-    runtime: docker
+    env: docker
     dockerfilePath: ./app/Dockerfile
     dockerContext: .
     plan: free
     dockerCommand: pnpm tsx src/wss-server.ts
-
-  - type: worker
-    name: querykey-prod-worker
-    runtime: docker
-    dockerfilePath: ./app/Dockerfile
-    dockerContext: .
-    plan: pro
-    dockerCommand: /code/app/scripts/run-workers-prod.sh
-    envVars:
-      - key: DATABASE_URL
-        fromDatabase:
-          name: querykey-prod
-          property: connectionString
-      - fromGroup: querykey-prod
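The render.yaml change moves the database to the starter plan and the web service to the standard plan, drops the separate worker service, and adds two environment variables, including `NEXT_PUBLIC_SOCKET_URL` pointing at the websocket service. As a rough sketch only, the snippet below shows one plausible way a Next.js client could read that variable with `socket.io-client` (the lockfile lists `engine.io-client`, which suggests socket.io is in use); the import, options, and local fallback URL are assumptions, not code from this repository.

```ts
// Hypothetical client-side sketch (not repo code): consuming the
// NEXT_PUBLIC_SOCKET_URL value that render.yaml now injects.
// Next.js inlines NEXT_PUBLIC_* variables into the browser bundle at build time.
import { io, type Socket } from "socket.io-client";

// Assumed local fallback for development; the deployed value comes from render.yaml.
const socketUrl = process.env.NEXT_PUBLIC_SOCKET_URL ?? "http://localhost:3001";

export const socket: Socket = io(socketUrl, {
  transports: ["websocket"], // talk directly to the dedicated wss service
});

socket.on("connect", () => {
  console.log("connected to", socketUrl);
});
```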