Compare commits


64 Commits

Author SHA1 Message Date
David Corbitt
73b9e40ced Give LoggedCallsTable scrollbar 2023-08-15 03:12:59 -07:00
David Corbitt
3447e863cc Prevent model name from wrapping 2023-08-15 02:53:24 -07:00
David Corbitt
897e77b054 Prevent logged calls table flashes 2023-08-15 02:49:46 -07:00
David Corbitt
b22a4cd93b Combine migrations 2023-08-15 02:34:27 -07:00
David Corbitt
3547c85c86 Display tag values 2023-08-15 02:32:05 -07:00
David Corbitt
9636fa033e Add second tag to seed 2023-08-15 02:31:24 -07:00
David Corbitt
890a738568 Filter by tags 2023-08-15 01:50:48 -07:00
David Corbitt
7003595e76 Install lodash-es in client-libs for omit function 2023-08-15 00:59:23 -07:00
David Corbitt
00df4453d3 Remove old prettier files 2023-08-15 00:55:05 -07:00
David Corbitt
4c325fc1cc Move prettier files to top directory 2023-08-15 00:54:52 -07:00
David Corbitt
dfee8a0ed7 Merge branch 'main' into log-filters 2023-08-15 00:41:28 -07:00
David Corbitt
0b4e116783 Undo changes in client-libs 2023-08-15 00:30:35 -07:00
David Corbitt
2bcb1d16a3 Autoresize InputDropdown 2023-08-15 00:27:12 -07:00
David Corbitt
6e7efee21e Seed with tags 2023-08-15 00:26:11 -07:00
David Corbitt
bb9c3a9e61 Condense table 2023-08-15 00:26:05 -07:00
David Corbitt
11bfb5d5e4 Start server with timezone 2023-08-14 23:37:23 -07:00
Kyle Corbitt
b00ab933b3 Merge pull request #157 from OpenPipe/more-js-api
TypeScript SDK mostly working
2023-08-14 23:25:33 -07:00
Kyle Corbitt
8f4e7f7e2e TypeScript SDK mostly working
Ok so this is still pretty rough, and notably there's no reporting for streaming. But for non-streaming requests I've verified that this does in fact report requests locally.
2023-08-14 23:22:27 -07:00
David Corbitt
634739c045 Add InputDropdown 2023-08-14 23:02:08 -07:00
David Corbitt
9a9cbe8fd4 Hide paginators for empty lists 2023-08-14 21:17:03 -07:00
David Corbitt
649dc3376b Debounce filter value updates 2023-08-14 21:00:42 -07:00
David Corbitt
05e774d021 Style filters title 2023-08-14 20:47:18 -07:00
David Corbitt
0e328b13dc Style add filter button 2023-08-14 20:42:51 -07:00
David Corbitt
0a18ca9cd6 Allow filtering by response, model, and status code 2023-08-14 20:16:44 -07:00
David Corbitt
a5fe35912e Allow filter by request contains 2023-08-14 20:01:17 -07:00
David Corbitt
3d3ddbe7a9 Show number of rows in table header 2023-08-14 19:56:15 -07:00
David Corbitt
d8a5617dee Increase button radius 2023-08-14 19:51:06 -07:00
Kyle Corbitt
5da62fdc29 Merge pull request #156 from OpenPipe/move-api
Python package improvements
2023-08-14 19:45:14 -07:00
Kyle Corbitt
754e273049 Python package improvements
Added an endpoint for getting the actual stored responses, and used it to test and improve the python package.
2023-08-14 19:07:03 -07:00
Kyle Corbitt
2863dc2f89 Merge pull request #155 from OpenPipe/move-api
Move the external API into its own router
2023-08-14 17:02:34 -07:00
Kyle Corbitt
c4cef35717 Move the external API into its own router
Auth logic isn't shared between the clients anyway, so co-locating them is confusing since you can't use the same clients to call both. This also makes the codegen clients less verbose.
2023-08-14 16:56:50 -07:00
Kyle Corbitt
8552baf632 Merge pull request #154 from OpenPipe/broken-page
Cap the number of waiting messages we try to render
2023-08-14 15:47:48 -07:00
Kyle Corbitt
f41e2229ca Cap the number of waiting messages we try to render
If a cell was attempted several hours ago and never resolved, it crashes the UI because we try to render thousands of log messages once a second (e.g. https://app.openpipe.ai/experiments/372d0827-186e-4a7d-a8a6-1bf7050eb5fd). We should probably have a different UI for cells that have hung for a long time to let you know you should just retry, but this quick fix should work for now.
2023-08-14 15:44:03 -07:00
arcticfly
e649f42c9c Await completions (#153)
* Continue polling stats while waiting for completions to finish

* Clarify convert to function call instructions
2023-08-14 13:03:48 -07:00
Kyle Corbitt
99f305483b Merge pull request #150 from OpenPipe/fix-build
(Probably) fixes the build
2023-08-14 07:59:20 -07:00
arcticfly
b28f4cad57 Remove scenarios header from output table card (#151) 2023-08-13 03:26:58 -07:00
Kyle Corbitt
df4a3a0950 (Probably) fixes the build
This probably fixes the build that I broke in https://github.com/OpenPipe/OpenPipe/pull/149. However, there's a small chance that it fixes it enough to deploy, but not enough to actually work. That would be bad, so not merging until I have time to monitor the deploy.
2023-08-12 23:50:31 -07:00
David Corbitt
e423ad656a Fix ExperimentCard aspect ratio 2023-08-12 23:31:25 -07:00
Kyle Corbitt
7d0d94de3a Merge pull request #149 from OpenPipe/js-client
Load the JS client using pnpm workspaces
2023-08-12 22:56:17 -07:00
Kyle Corbitt
344b257db4 Load the JS client using pnpm workspaces
This makes it so we're using our own openpipe client for all OpenAI calls from the OpenPipe app.

The client doesn't do anything at the moment beyond proxying to the OpenAI lib. But this infra work should make it easier to quickly iterate on the client and test the changes in our own app.
2023-08-12 15:24:48 -07:00
Kyle Corbitt
28b43b6e6d Merge pull request #148 from OpenPipe/js-client
Fix client bugs
2023-08-12 10:38:49 -07:00
Kyle Corbitt
8d373ec9b5 remove unused imports 2023-08-12 10:02:23 -07:00
Kyle Corbitt
537525667d don't reload monaco every render cycle
oops
2023-08-12 09:59:07 -07:00
Kyle Corbitt
519367c553 Fix client bugs
1. PostHog can only be used client-side
2. Can't nest <a> tags in the ProjectMenu
2023-08-12 09:35:52 -07:00
Kyle Corbitt
1a338ec863 Merge pull request #147 from OpenPipe/logs-ui
Style overhaul, make logged calls selectable
2023-08-12 08:48:49 -07:00
David Corbitt
01d0b8f778 Resurrect UserMenu 2023-08-12 04:28:41 -07:00
David Corbitt
d99836ec30 Add experiment button 2023-08-12 04:18:39 -07:00
David Corbitt
33751c12d2 Allow user to select logs 2023-08-12 04:07:58 -07:00
David Corbitt
89815e1f7f Add selectedLogs, rename setSelectedProjectId 2023-08-12 03:35:54 -07:00
David Corbitt
5fa5109f34 Make cache text gray 2023-08-12 03:06:19 -07:00
David Corbitt
b06ab2cbf9 Properly show model 2023-08-12 02:58:28 -07:00
David Corbitt
35fb554038 Center Add Variant button 2023-08-12 02:48:22 -07:00
David Corbitt
f238177277 Fix variant header top right border radius 2023-08-12 02:46:09 -07:00
David Corbitt
723c0f7505 Update colors throughout app 2023-08-12 02:32:09 -07:00
David Corbitt
ce6936f753 Change overall background color and menu 2023-08-12 02:31:52 -07:00
David Corbitt
2a80cbf74a Add relative time back in 2023-08-12 02:06:56 -07:00
David Corbitt
098805ef25 Create loggedCalls.router 2023-08-12 00:03:06 -07:00
David Corbitt
ed90bc5a99 Add dashboard page 2023-08-11 23:34:53 -07:00
arcticfly
de9be8c7ce Allow custom config file (#143)
* Allow custom config file

* Temporarily remove dependency on local openpipe
2023-08-11 23:07:04 -07:00
arcticfly
3e02bcf9b8 Update paginator styles (#142)
* Change paginator icons

* Remove horizontal spacing
2023-08-11 21:11:25 -07:00
Kyle Corbitt
cef2ee31fb Merge pull request #141 from OpenPipe/python-sdk
Add caching in Python
2023-08-11 19:04:18 -07:00
arcticfly
228c547839 Add logged calls pagination (#140)
* Store model on LoggedCall

* Allow multiple page sizes

* Add logged calls pagination
2023-08-11 19:00:09 -07:00
Kyle Corbitt
e1fcc8fb38 Merge pull request #139 from OpenPipe/python-sdk
Add a python client library
2023-08-11 17:48:07 -07:00
arcticfly
3a908d51aa Store model on LoggedCall (#138) 2023-08-11 16:39:04 -07:00
136 changed files with 4724 additions and 3520 deletions

.dockerignore (new file, 5 lines added)

@@ -0,0 +1,5 @@
**/node_modules/
.git
**/.venv/
**/.env*
**/.next/

.gitignore (vendored, 2 lines added)

@@ -1,3 +1,5 @@
.env
.venv/
*.pyc
node_modules/
*.tsbuildinfo

.prettierignore (new file, 2 lines added)

@@ -0,0 +1,2 @@
*.schema.json
app/pnpm-lock.yaml

View File

@@ -32,5 +32,5 @@ NEXT_PUBLIC_HOST="http://localhost:3000"
GITHUB_CLIENT_ID="your_client_id"
GITHUB_CLIENT_SECRET="your_secret"
OPENPIPE_BASE_URL="http://localhost:3000/api"
OPENPIPE_BASE_URL="http://localhost:3000/api/v1"
OPENPIPE_API_KEY="your_key"

View File

@@ -6,7 +6,7 @@ const config = {
overrides: [
{
extends: ["plugin:@typescript-eslint/recommended-requiring-type-checking"],
files: ["*.ts", "*.tsx"],
files: ["*.mts", "*.ts", "*.tsx"],
parserOptions: {
project: path.join(__dirname, "tsconfig.json"),
},

app/.gitignore (vendored, 3 lines added)

@@ -44,3 +44,6 @@ yarn-error.log*
# Sentry Auth Token
.sentryclirc
# custom openai initialization
src/server/utils/openaiCustomConfig.json

View File

@@ -1,2 +0,0 @@
*.schema.json
pnpm-lock.yaml

View File

@@ -12,19 +12,19 @@ declare module "nextjs-routes" {
export type Route =
| StaticRoute<"/account/signin">
| DynamicRoute<"/api/[...trpc]", { "trpc": string[] }>
| DynamicRoute<"/api/auth/[...nextauth]", { "nextauth": string[] }>
| StaticRoute<"/api/experiments/og-image">
| StaticRoute<"/api/openapi">
| StaticRoute<"/api/sentry-example-api">
| DynamicRoute<"/api/trpc/[trpc]", { "trpc": string }>
| DynamicRoute<"/api/v1/[...trpc]", { "trpc": string[] }>
| StaticRoute<"/api/v1/openapi">
| StaticRoute<"/dashboard">
| DynamicRoute<"/data/[id]", { "id": string }>
| StaticRoute<"/data">
| DynamicRoute<"/experiments/[id]", { "id": string }>
| StaticRoute<"/experiments">
| StaticRoute<"/">
| StaticRoute<"/logged-calls">
| StaticRoute<"/project/settings">
| StaticRoute<"/request-logs">
| StaticRoute<"/sentry-example-page">
| StaticRoute<"/world-champs">
| StaticRoute<"/world-champs/signup">;

View File

@@ -6,13 +6,13 @@ RUN yarn global add pnpm
# DEPS
FROM base as deps
WORKDIR /app
WORKDIR /code
COPY prisma ./
COPY app/prisma app/package.json ./app/
COPY client-libs/typescript/package.json ./client-libs/typescript/
COPY pnpm-lock.yaml pnpm-workspace.yaml ./
COPY package.json pnpm-lock.yaml ./
RUN pnpm install --frozen-lockfile
RUN cd app && pnpm install --frozen-lockfile
# BUILDER
FROM base as builder
@@ -25,22 +25,24 @@ ARG NEXT_PUBLIC_SENTRY_DSN
ARG SENTRY_AUTH_TOKEN
ARG NEXT_PUBLIC_FF_SHOW_LOGGED_CALLS
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
WORKDIR /code
COPY --from=deps /code/node_modules ./node_modules
COPY --from=deps /code/app/node_modules ./app/node_modules
COPY --from=deps /code/client-libs/typescript/node_modules ./client-libs/typescript/node_modules
COPY . .
RUN SKIP_ENV_VALIDATION=1 pnpm build
RUN cd app && SKIP_ENV_VALIDATION=1 pnpm build
# RUNNER
FROM base as runner
WORKDIR /app
WORKDIR /code/app
ENV NODE_ENV production
ENV NEXT_TELEMETRY_DISABLED 1
COPY --from=builder /app/ ./
COPY --from=builder /code/ /code/
EXPOSE 3000
ENV PORT 3000
# Run the "run-prod.sh" script
CMD /app/run-prod.sh
CMD /code/app/run-prod.sh

View File

@@ -36,6 +36,8 @@ let config = {
});
return config;
},
transpilePackages: ["openpipe"],
};
config = nextRoutes()(config);

View File

@@ -1,5 +1,6 @@
{
"name": "openpipe",
"name": "openpipe-app",
"private": true,
"type": "module",
"version": "0.1.0",
"license": "Apache-2.0",
@@ -9,14 +10,14 @@
},
"scripts": {
"build": "next build",
"dev:next": "next dev",
"dev:next": "TZ=UTC next dev",
"dev:wss": "pnpm tsx --watch src/wss-server.ts",
"dev:worker": "NODE_ENV='development' pnpm tsx --watch src/server/tasks/worker.ts",
"dev": "concurrently --kill-others 'pnpm dev:next' 'pnpm dev:wss' 'pnpm dev:worker'",
"postinstall": "prisma generate",
"lint": "next lint",
"start": "next start",
"codegen": "tsx src/server/scripts/client-codegen.ts",
"start": "TZ=UTC next start",
"codegen:clients": "tsx src/server/scripts/client-codegen.ts",
"seed": "tsx prisma/seed.ts",
"check": "concurrently 'pnpm lint' 'pnpm tsc' 'pnpm prettier . --check'",
"test": "pnpm vitest"
@@ -24,7 +25,6 @@
"dependencies": {
"@anthropic-ai/sdk": "^0.5.8",
"@apidevtools/json-schema-ref-parser": "^10.1.0",
"@babel/preset-typescript": "^7.22.5",
"@babel/standalone": "^7.22.9",
"@chakra-ui/anatomy": "^2.2.0",
"@chakra-ui/next-js": "^2.1.4",
@@ -72,6 +72,7 @@
"nextjs-cors": "^2.1.2",
"nextjs-routes": "^2.0.1",
"openai": "4.0.0-beta.7",
"openpipe": "workspace:*",
"pg": "^8.11.2",
"pluralize": "^8.0.0",
"posthog-js": "^1.75.3",
@@ -128,6 +129,7 @@
"eslint-plugin-unused-imports": "^2.0.0",
"monaco-editor": "^0.40.0",
"openapi-typescript": "^6.3.4",
"openapi-typescript-codegen": "^0.25.0",
"prisma": "^4.14.0",
"raw-loader": "^4.0.2",
"typescript": "^5.0.4",

View File

@@ -0,0 +1,2 @@
-- AlterTable
ALTER TABLE "LoggedCall" ADD COLUMN "model" TEXT;

View File

@@ -0,0 +1,22 @@
-- DropIndex
DROP INDEX "LoggedCallTag_name_idx";
DROP INDEX "LoggedCallTag_name_value_idx";
-- AlterTable: Add projectId column without NOT NULL constraint for now
ALTER TABLE "LoggedCallTag" ADD COLUMN "projectId" UUID;
-- Set the default value
UPDATE "LoggedCallTag" lct
SET "projectId" = lc."projectId"
FROM "LoggedCall" lc
WHERE lct."loggedCallId" = lc.id;
-- Now set the NOT NULL constraint
ALTER TABLE "LoggedCallTag" ALTER COLUMN "projectId" SET NOT NULL;
-- CreateIndex
CREATE INDEX "LoggedCallTag_projectId_name_idx" ON "LoggedCallTag"("projectId", "name");
CREATE INDEX "LoggedCallTag_projectId_name_value_idx" ON "LoggedCallTag"("projectId", "name", "value");
-- CreateIndex
CREATE UNIQUE INDEX "LoggedCallTag_loggedCallId_name_key" ON "LoggedCallTag"("loggedCallId", "name");

View File

@@ -112,17 +112,17 @@ model ScenarioVariantCell {
model ModelResponse {
id String @id @default(uuid()) @db.Uuid
cacheKey String
requestedAt DateTime?
receivedAt DateTime?
respPayload Json?
cost Float?
inputTokens Int?
outputTokens Int?
statusCode Int?
errorMessage String?
retryTime DateTime?
outdated Boolean @default(false)
cacheKey String
requestedAt DateTime?
receivedAt DateTime?
respPayload Json?
cost Float?
inputTokens Int?
outputTokens Int?
statusCode Int?
errorMessage String?
retryTime DateTime?
outdated Boolean @default(false)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@ -273,7 +273,8 @@ model LoggedCall {
projectId String @db.Uuid
project Project? @relation(fields: [projectId], references: [id], onDelete: Cascade)
tags LoggedCallTag[]
model String?
tags LoggedCallTag[]
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@ -294,7 +295,7 @@ model LoggedCallModelResponse {
errorMessage String?
requestedAt DateTime
receivedAt DateTime
receivedAt DateTime
// Note: the function to calculate the cacheKey should include the project
// ID so we don't share cached responses between projects, which could be an
@@ -325,12 +326,14 @@ model LoggedCallTag {
id String @id @default(uuid()) @db.Uuid
name String
value String?
projectId String @db.Uuid
loggedCallId String @db.Uuid
loggedCall LoggedCall @relation(fields: [loggedCallId], references: [id], onDelete: Cascade)
@@index([name])
@@index([name, value])
@@unique([loggedCallId, name])
@@index([projectId, name])
@@index([projectId, name, value])
}
model ApiKey {
@@ -339,8 +342,8 @@ model ApiKey {
name String
apiKey String @unique
projectId String @db.Uuid
project Project? @relation(fields: [projectId], references: [id], onDelete: Cascade)
projectId String @db.Uuid
project Project @relation(fields: [projectId], references: [id], onDelete: Cascade)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt

View File

@@ -2,6 +2,7 @@ import { prisma } from "~/server/db";
import dedent from "dedent";
import { generateNewCell } from "~/server/utils/generateNewCell";
import { promptConstructorVersion } from "~/promptConstructor/version";
import { env } from "~/env.mjs";
const defaultId = "11111111-1111-1111-1111-111111111111";
@@ -16,6 +17,16 @@ const project =
data: { id: defaultId },
}));
if (env.OPENPIPE_API_KEY) {
await prisma.apiKey.create({
data: {
projectId: project.id,
name: "Default API Key",
apiKey: env.OPENPIPE_API_KEY,
},
});
}
await prisma.experiment.deleteMany({
where: {
id: defaultId,

View File

@@ -13,6 +13,7 @@ const MODEL_RESPONSE_TEMPLATES: {
inputTokens: number;
outputTokens: number;
finishReason: string;
tags: { name: string; value: string }[];
}[] = [
{
reqPayload: {
@@ -107,6 +108,7 @@ const MODEL_RESPONSE_TEMPLATES: {
inputTokens: 236,
outputTokens: 5,
finishReason: "stop",
tags: [],
},
{
reqPayload: {
@@ -193,6 +195,7 @@ const MODEL_RESPONSE_TEMPLATES: {
inputTokens: 222,
outputTokens: 5,
finishReason: "stop",
tags: [],
},
{
reqPayload: {
@@ -231,6 +234,7 @@ const MODEL_RESPONSE_TEMPLATES: {
inputTokens: 14,
outputTokens: 7,
finishReason: "stop",
tags: [{ name: "prompt_id", value: "id2" }],
},
{
reqPayload: {
@@ -306,6 +310,10 @@ const MODEL_RESPONSE_TEMPLATES: {
inputTokens: 2802,
outputTokens: 108,
finishReason: "stop",
tags: [
{ name: "prompt_id", value: "chatcmpl-7lQS3MktOT8BTgNEytl9dkyssCQqL" },
{ name: "some_other_tag", value: "some_other_value" },
],
},
];
@@ -349,6 +357,7 @@ for (let i = 0; i < 1437; i++) {
cacheHit: false,
requestedAt,
projectId: project.id,
model: template.reqPayload.model,
createdAt: requestedAt,
});
@@ -388,11 +397,14 @@ for (let i = 0; i < 1437; i++) {
modelResponseId: loggedCallModelResponseId,
},
});
loggedCallTagsToCreate.push({
loggedCallId,
name: "$model",
value: template.reqPayload.model,
});
for (const tag of template.tags) {
loggedCallTagsToCreate.push({
projectId: project.id,
loggedCallId,
name: tag.name,
value: tag.value,
});
}
}
await prisma.$transaction([

View File

@@ -0,0 +1,80 @@
import {
Input,
InputGroup,
InputRightElement,
Icon,
Popover,
PopoverTrigger,
PopoverContent,
VStack,
HStack,
Button,
Text,
useDisclosure,
} from "@chakra-ui/react";
import { FiChevronDown } from "react-icons/fi";
import { BiCheck } from "react-icons/bi";
type InputDropdownProps<T> = {
options: ReadonlyArray<T>;
selectedOption: T;
onSelect: (option: T) => void;
};
const InputDropdown = <T,>({ options, selectedOption, onSelect }: InputDropdownProps<T>) => {
const popover = useDisclosure();
return (
<Popover placement="bottom-start" {...popover}>
<PopoverTrigger>
<InputGroup cursor="pointer" w={(selectedOption as string).length * 14 + 180}>
<Input
value={selectedOption as string}
// eslint-disable-next-line @typescript-eslint/no-empty-function -- controlled input requires onChange
onChange={() => {}}
cursor="pointer"
borderColor={popover.isOpen ? "blue.500" : undefined}
_hover={popover.isOpen ? { borderColor: "blue.500" } : undefined}
contentEditable={false}
// disable focus
onFocus={(e) => {
e.target.blur();
}}
/>
<InputRightElement>
<Icon as={FiChevronDown} />
</InputRightElement>
</InputGroup>
</PopoverTrigger>
<PopoverContent boxShadow="0 0 40px 4px rgba(0, 0, 0, 0.1);" minW={0} w="auto">
<VStack spacing={0}>
{options?.map((option, index) => (
<HStack
key={index}
as={Button}
onClick={() => {
onSelect(option);
popover.onClose();
}}
w="full"
variant="ghost"
justifyContent="space-between"
fontWeight="semibold"
borderRadius={0}
colorScheme="blue"
color="black"
fontSize="sm"
borderBottomWidth={1}
>
<Text mr={16}>{option as string}</Text>
{option === selectedOption && <Icon as={BiCheck} color="blue.500" boxSize={5} />}
</HStack>
))}
</VStack>
</PopoverContent>
</Popover>
);
};
export default InputDropdown;
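For reference, a minimal usage sketch of the new InputDropdown component, based only on the props visible in the diff above (options, selectedOption, onSelect); the wrapper component and option values are hypothetical:

import { useState } from "react";
import InputDropdown from "~/components/InputDropdown"; // import path assumed

// Hypothetical picker: InputDropdown is controlled, so the parent owns the
// selected value and updates it in onSelect.
const ModelPicker = () => {
  const [model, setModel] = useState("gpt-3.5-turbo");
  return (
    <InputDropdown
      options={["gpt-3.5-turbo", "gpt-4"]} // ReadonlyArray<T>
      selectedOption={model}               // rendered in the read-only input
      onSelect={setModel}                  // fired when a menu item is clicked
    />
  );
};

export default ModelPicker;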

View File

@@ -33,25 +33,11 @@ export default function AddVariantButton() {
<Flex w="100%" justifyContent="flex-end">
<ActionButton
onClick={onClick}
py={5}
py={7}
leftIcon={<Icon as={loading ? Spinner : BsPlus} boxSize={6} mr={loading ? 1 : 0} />}
>
<Text display={{ base: "none", md: "flex" }}>Add Variant</Text>
</ActionButton>
{/* <Button
alignItems="center"
justifyContent="center"
fontWeight="normal"
bgColor="transparent"
_hover={{ bgColor: "gray.100" }}
px={cellPadding.x}
onClick={onClick}
height="unset"
minH={headerMinHeight}
>
<Icon as={loading ? Spinner : BsPlus} boxSize={6} mr={loading ? 1 : 0} />
<Text display={{ base: "none", md: "flex" }}>Add Variant</Text>
</Button> */}
</Flex>
);
}

View File

@@ -33,7 +33,7 @@ export default function OutputCell({
if (!templateHasVariables) disabledReason = "Add a value to the scenario variables to see output";
const [refetchInterval, setRefetchInterval] = useState(0);
const [refetchInterval, setRefetchInterval] = useState<number | false>(false);
const { data: cell, isLoading: queryLoading } = api.scenarioVariantCells.get.useQuery(
{ scenarioId: scenario.id, variantId: variant.id },
{ refetchInterval },
@@ -64,7 +64,8 @@ export default function OutputCell({
cell.retrievalStatus === "PENDING" ||
cell.retrievalStatus === "IN_PROGRESS" ||
hardRefetching;
useEffect(() => setRefetchInterval(awaitingOutput ? 1000 : 0), [awaitingOutput]);
useEffect(() => setRefetchInterval(awaitingOutput ? 1000 : false), [awaitingOutput]);
// TODO: disconnect from socket if we're not streaming anymore
const streamedMessage = useSocket<OutputSchema>(cell?.id);
@@ -120,8 +121,13 @@ export default function OutputCell({
? response.receivedAt.getTime()
: Date.now();
if (response.requestedAt) {
numWaitingMessages = Math.floor(
(relativeWaitingTime - response.requestedAt.getTime()) / WAITING_MESSAGE_INTERVAL,
numWaitingMessages = Math.min(
Math.floor(
(relativeWaitingTime - response.requestedAt.getTime()) / WAITING_MESSAGE_INTERVAL,
),
// Don't try to render more than 15, it'll use too much CPU and
// break the page
15,
);
}
return (
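To make the effect of the cap concrete, a standalone arithmetic sketch (the one-second interval and the four-hour figure are assumptions for illustration, not taken from the diff):

const WAITING_MESSAGE_INTERVAL = 1000; // assumed: one waiting message per second
const elapsedMs = 4 * 60 * 60 * 1000;  // a request that has hung for 4 hours

// Uncapped, this would be 14,400 messages re-rendered on every poll; the
// Math.min above clamps it to 15 so a long-hung cell can no longer lock up the page.
const numWaitingMessages = Math.min(Math.floor(elapsedMs / WAITING_MESSAGE_INTERVAL), 15);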

View File

@@ -1,21 +1,16 @@
import { type StackProps } from "@chakra-ui/react";
import { useScenarios } from "~/utils/hooks";
import Paginator from "../Paginator";
const ScenarioPaginator = () => {
const ScenarioPaginator = (props: StackProps) => {
const { data } = useScenarios();
if (!data) return null;
const { scenarios, startIndex, lastPage, count } = data;
const { count } = data;
return (
<Paginator
numItemsLoaded={scenarios.length}
startIndex={startIndex}
lastPage={lastPage}
count={count}
/>
);
return <Paginator count={count} condense {...props} />;
};
export default ScenarioPaginator;

View File

@@ -10,6 +10,8 @@ const ScenarioRow = (props: {
variants: PromptVariant[];
canHide: boolean;
rowStart: number;
isFirst: boolean;
isLast: boolean;
}) => {
const [isHovered, setIsHovered] = useState(false);
@@ -21,10 +23,14 @@ const ScenarioRow = (props: {
onMouseEnter={() => setIsHovered(true)}
onMouseLeave={() => setIsHovered(false)}
sx={isHovered ? highlightStyle : undefined}
borderLeftWidth={1}
{...borders}
bgColor="white"
rowStart={props.rowStart}
colStart={1}
borderLeftWidth={1}
borderTopWidth={props.isFirst ? 1 : 0}
borderTopLeftRadius={props.isFirst ? 8 : 0}
borderBottomLeftRadius={props.isLast ? 8 : 0}
{...borders}
>
<ScenarioEditor scenario={props.scenario} hovered={isHovered} canHide={props.canHide} />
</GridItem>
@@ -34,8 +40,12 @@ const ScenarioRow = (props: {
onMouseEnter={() => setIsHovered(true)}
onMouseLeave={() => setIsHovered(false)}
sx={isHovered ? highlightStyle : undefined}
bgColor="white"
rowStart={props.rowStart}
colStart={i + 2}
borderTopWidth={props.isFirst ? 1 : 0}
borderTopRightRadius={props.isFirst && i === props.variants.length - 1 ? 8 : 0}
borderBottomRightRadius={props.isLast && i === props.variants.length - 1 ? 8 : 0}
{...borders}
>
<OutputCell key={variant.id} scenario={props.scenario} variant={variant} />

View File

@@ -48,7 +48,7 @@ export const ScenariosHeader = () => {
);
return (
<HStack w="100%" pb={cellPadding.y} pt={0} align="center" spacing={0}>
<HStack w="100%" py={cellPadding.y} px={cellPadding.x} align="center" spacing={0}>
<Text fontSize={16} fontWeight="bold">
Scenarios ({scenarios.data?.count})
</Text>
@@ -57,11 +57,16 @@ export const ScenariosHeader = () => {
<MenuButton
as={IconButton}
mt={1}
ml={2}
variant="ghost"
aria-label="Edit Scenarios"
icon={<Icon as={loading ? Spinner : BsGear} />}
maxW={8}
minW={8}
minH={8}
maxH={8}
/>
<MenuList fontSize="md" zIndex="dropdown" mt={-3}>
<MenuList fontSize="md" zIndex="dropdown" mt={-1}>
<MenuItem
icon={<Icon as={BsPlus} boxSize={6} mx="-5px" />}
onClick={() => onAddScenario(false)}

View File

@@ -21,14 +21,18 @@ export default function VariantStats(props: { variant: PromptVariant }) {
outputTokens: 0,
scenarioCount: 0,
outputCount: 0,
awaitingCompletions: false,
awaitingEvals: false,
},
refetchInterval,
},
);
// Poll every two seconds while we are waiting for LLM retrievals to finish
useEffect(() => setRefetchInterval(data.awaitingEvals ? 5000 : 0), [data.awaitingEvals]);
// Poll every five seconds while we are waiting for LLM retrievals to finish
useEffect(
() => setRefetchInterval(data.awaitingCompletions || data.awaitingEvals ? 5000 : 0),
[data.awaitingCompletions, data.awaitingEvals],
);
const [passColor, neutralColor, failColor] = useToken("colors", [
"green.500",

View File

@@ -53,20 +53,29 @@ export default function OutputsTable({ experimentId }: { experimentId: string |
colStart: i + 2,
borderLeftWidth: i === 0 ? 1 : 0,
marginLeft: i === 0 ? "-1px" : 0,
backgroundColor: "gray.100",
backgroundColor: "white",
};
const isFirst = i === 0;
const isLast = i === variants.data.length - 1;
return (
<Fragment key={variant.uiId}>
<VariantHeader
variant={variant}
canHide={variants.data.length > 1}
rowStart={1}
borderTopLeftRadius={isFirst ? 8 : 0}
borderTopRightRadius={isLast ? 8 : 0}
{...sharedProps}
/>
<GridItem rowStart={2} {...sharedProps}>
<VariantEditor variant={variant} />
</GridItem>
<GridItem rowStart={3} {...sharedProps}>
<GridItem
rowStart={3}
{...sharedProps}
borderBottomLeftRadius={isFirst ? 8 : 0}
borderBottomRightRadius={isLast ? 8 : 0}
>
<VariantStats variant={variant} />
</GridItem>
</Fragment>
@@ -77,7 +86,6 @@ export default function OutputsTable({ experimentId }: { experimentId: string |
colSpan={allCols - 1}
rowStart={variantHeaderRows + 1}
colStart={1}
{...borders}
borderRightWidth={0}
>
<ScenariosHeader />
@@ -90,6 +98,8 @@ export default function OutputsTable({ experimentId }: { experimentId: string |
scenario={scenario}
variants={variants.data}
canHide={visibleScenariosCount > 1}
isFirst={i === 0}
isLast={i === visibleScenariosCount - 1}
/>
))}
<GridItem

View File

@@ -1,77 +1,119 @@
import { Box, HStack, IconButton } from "@chakra-ui/react";
import {
BsChevronDoubleLeft,
BsChevronDoubleRight,
BsChevronLeft,
BsChevronRight,
} from "react-icons/bs";
import { usePage } from "~/utils/hooks";
import { HStack, IconButton, Text, Select, type StackProps, Icon } from "@chakra-ui/react";
import React, { useCallback } from "react";
import { FiChevronsLeft, FiChevronsRight, FiChevronLeft, FiChevronRight } from "react-icons/fi";
import { usePageParams } from "~/utils/hooks";
const pageSizeOptions = [10, 25, 50, 100];
const Paginator = ({
numItemsLoaded,
startIndex,
lastPage,
count,
}: {
numItemsLoaded: number;
startIndex: number;
lastPage: number;
count: number;
}) => {
const [page, setPage] = usePage();
condense,
...props
}: { count: number; condense?: boolean } & StackProps) => {
const { page, pageSize, setPageParams } = usePageParams();
const lastPage = Math.ceil(count / pageSize);
const updatePageSize = useCallback(
(newPageSize: number) => {
const newPage = Math.floor(((page - 1) * pageSize) / newPageSize) + 1;
setPageParams({ page: newPage, pageSize: newPageSize }, "replace");
},
[page, pageSize, setPageParams],
);
const nextPage = () => {
if (page < lastPage) {
setPage(page + 1, "replace");
setPageParams({ page: page + 1 }, "replace");
}
};
const prevPage = () => {
if (page > 1) {
setPage(page - 1, "replace");
setPageParams({ page: page - 1 }, "replace");
}
};
const goToLastPage = () => setPage(lastPage, "replace");
const goToFirstPage = () => setPage(1, "replace");
const goToLastPage = () => setPageParams({ page: lastPage }, "replace");
const goToFirstPage = () => setPageParams({ page: 1 }, "replace");
if (count === 0) return null;
return (
<HStack pt={4}>
<IconButton
variant="ghost"
size="sm"
onClick={goToFirstPage}
isDisabled={page === 1}
aria-label="Go to first page"
icon={<BsChevronDoubleLeft />}
/>
<IconButton
variant="ghost"
size="sm"
onClick={prevPage}
isDisabled={page === 1}
aria-label="Previous page"
icon={<BsChevronLeft />}
/>
<Box>
{startIndex}-{startIndex + numItemsLoaded - 1} / {count}
</Box>
<IconButton
variant="ghost"
size="sm"
onClick={nextPage}
isDisabled={page === lastPage}
aria-label="Next page"
icon={<BsChevronRight />}
/>
<IconButton
variant="ghost"
size="sm"
onClick={goToLastPage}
isDisabled={page === lastPage}
aria-label="Go to last page"
icon={<BsChevronDoubleRight />}
/>
<HStack
pt={4}
spacing={8}
justifyContent={condense ? "flex-start" : "space-between"}
alignItems="center"
w="full"
{...props}
>
{!condense && (
<>
<HStack>
<Text>Rows</Text>
<Select
value={pageSize}
onChange={(e) => updatePageSize(parseInt(e.target.value))}
w={20}
backgroundColor="white"
>
{pageSizeOptions.map((option) => (
<option key={option} value={option}>
{option}
</option>
))}
</Select>
</HStack>
<Text>
Page {page} of {lastPage}
</Text>
</>
)}
<HStack>
<IconButton
variant="outline"
size="sm"
onClick={goToFirstPage}
isDisabled={page === 1}
aria-label="Go to first page"
icon={<Icon as={FiChevronsLeft} boxSize={5} strokeWidth={1.5} />}
bgColor="white"
/>
<IconButton
variant="outline"
size="sm"
onClick={prevPage}
isDisabled={page === 1}
aria-label="Previous page"
icon={<Icon as={FiChevronLeft} boxSize={5} strokeWidth={1.5} />}
bgColor="white"
/>
{condense && (
<Text>
Page {page} of {lastPage}
</Text>
)}
<IconButton
variant="outline"
size="sm"
onClick={nextPage}
isDisabled={page === lastPage}
aria-label="Next page"
icon={<Icon as={FiChevronRight} boxSize={5} strokeWidth={1.5} />}
bgColor="white"
/>
<IconButton
variant="outline"
size="sm"
onClick={goToLastPage}
isDisabled={page === lastPage}
aria-label="Go to last page"
icon={<Icon as={FiChevronsRight} boxSize={5} strokeWidth={1.5} />}
bgColor="white"
/>
</HStack>
</HStack>
);
};
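A quick worked example of the page recalculation in updatePageSize above, showing why it keeps the first visible row on screen (the numbers are illustrative only):

// The first visible item index is (page - 1) * pageSize, so after a page-size
// change we pick the page that still contains that index.
const page = 4;          // viewing rows 75-99 at...
const pageSize = 25;     // ...25 rows per page
const newPageSize = 50;

const newPage = Math.floor(((page - 1) * pageSize) / newPageSize) + 1;
// ((4 - 1) * 25) / 50 = 1.5 -> floor 1 -> newPage = 2, which covers rows 50-99,
// so row 75 is still visible after switching to 50 rows per page.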

View File

@@ -75,7 +75,7 @@ export default function VariantHeader(
padding={0}
sx={{
position: "sticky",
top: "0",
top: "-2",
// Ensure that the menu always appears above the sticky header of other variants
zIndex: menuOpen ? "dropdown" : 10,
}}
@@ -84,6 +84,7 @@ export default function VariantHeader(
>
<HStack
spacing={2}
py={2}
alignItems="flex-start"
minH={headerMinHeight}
draggable={!isInputHovered}
@@ -102,7 +103,9 @@ export default function VariantHeader(
setIsDragTarget(false);
}}
onDrop={onReorder}
backgroundColor={isDragTarget ? "gray.200" : "gray.100"}
backgroundColor={isDragTarget ? "gray.200" : "white"}
borderTopLeftRadius={gridItemProps.borderTopLeftRadius}
borderTopRightRadius={gridItemProps.borderTopRightRadius}
h="full"
>
<Icon

View File

@@ -1,209 +0,0 @@
import {
Box,
Card,
CardHeader,
Heading,
Table,
Tbody,
Td,
Th,
Thead,
Tr,
Tooltip,
Collapse,
HStack,
VStack,
IconButton,
useToast,
Icon,
Button,
ButtonGroup,
} from "@chakra-ui/react";
import dayjs from "dayjs";
import relativeTime from "dayjs/plugin/relativeTime";
import { ChevronUpIcon, ChevronDownIcon, CopyIcon } from "lucide-react";
import { useMemo, useState } from "react";
import { type RouterOutputs, api } from "~/utils/api";
import SyntaxHighlighter from "react-syntax-highlighter";
import { atelierCaveLight } from "react-syntax-highlighter/dist/cjs/styles/hljs";
import stringify from "json-stringify-pretty-compact";
import Link from "next/link";
dayjs.extend(relativeTime);
type LoggedCall = RouterOutputs["dashboard"]["loggedCalls"][0];
const FormattedJson = ({ json }: { json: any }) => {
const jsonString = stringify(json, { maxLength: 40 });
const toast = useToast();
const copyToClipboard = async (text: string) => {
try {
await navigator.clipboard.writeText(text);
toast({
title: "Copied to clipboard",
status: "success",
duration: 2000,
});
} catch (err) {
toast({
title: "Failed to copy to clipboard",
status: "error",
duration: 2000,
});
}
};
return (
<Box position="relative" fontSize="sm" borderRadius="md" overflow="hidden">
<SyntaxHighlighter
customStyle={{ overflowX: "unset" }}
language="json"
style={atelierCaveLight}
lineProps={{
style: { wordBreak: "break-all", whiteSpace: "pre-wrap" },
}}
wrapLines
>
{jsonString}
</SyntaxHighlighter>
<IconButton
aria-label="Copy"
icon={<CopyIcon />}
position="absolute"
top={1}
right={1}
size="xs"
variant="ghost"
onClick={() => void copyToClipboard(jsonString)}
/>
</Box>
);
};
function TableRow({
loggedCall,
isExpanded,
onToggle,
}: {
loggedCall: LoggedCall;
isExpanded: boolean;
onToggle: () => void;
}) {
const isError = loggedCall.modelResponse?.statusCode !== 200;
const timeAgo = dayjs(loggedCall.requestedAt).fromNow();
const fullTime = dayjs(loggedCall.requestedAt).toString();
const model = useMemo(
() => loggedCall.tags.find((tag) => tag.name.startsWith("$model"))?.value,
[loggedCall.tags],
);
const durationCell = (
<Td isNumeric>
{loggedCall.cacheHit
? "Cache hit"
: ((loggedCall.modelResponse?.durationMs ?? 0) / 1000).toFixed(2) + "s"}
</Td>
);
return (
<>
<Tr
onClick={onToggle}
key={loggedCall.id}
_hover={{ bgColor: "gray.100", cursor: "pointer" }}
sx={{
"> td": { borderBottom: "none" },
}}
>
<Td>
<Icon boxSize={6} as={isExpanded ? ChevronUpIcon : ChevronDownIcon} />
</Td>
<Td>
<Tooltip label={fullTime} placement="top">
<Box whiteSpace="nowrap" minW="120px">
{timeAgo}
</Box>
</Tooltip>
</Td>
<Td width="100%">{model}</Td>
{durationCell}
<Td isNumeric>{loggedCall.modelResponse?.inputTokens}</Td>
<Td isNumeric>{loggedCall.modelResponse?.outputTokens}</Td>
<Td sx={{ color: isError ? "red.500" : "green.500", fontWeight: "semibold" }} isNumeric>
{loggedCall.modelResponse?.statusCode ?? "No response"}
</Td>
</Tr>
<Tr>
<Td colSpan={8} p={0}>
<Collapse in={isExpanded} unmountOnExit={true}>
<VStack p={4} align="stretch">
<HStack align="stretch">
<VStack flex={1} align="stretch">
<Heading size="sm">Input</Heading>
<FormattedJson json={loggedCall.modelResponse?.reqPayload} />
</VStack>
<VStack flex={1} align="stretch">
<Heading size="sm">Output</Heading>
<FormattedJson json={loggedCall.modelResponse?.respPayload} />
</VStack>
</HStack>
<ButtonGroup alignSelf="flex-end">
<Button as={Link} colorScheme="blue" href={{ pathname: "/experiments" }}>
Experiments
</Button>
</ButtonGroup>
</VStack>
</Collapse>
</Td>
</Tr>
</>
);
}
export default function LoggedCallTable() {
const [expandedRow, setExpandedRow] = useState<string | null>(null);
const loggedCalls = api.dashboard.loggedCalls.useQuery({});
return (
<Card variant="outline" width="100%" overflow="hidden">
<CardHeader>
<Heading as="h3" size="sm">
Logged Calls
</Heading>
</CardHeader>
<Table>
<Thead>
<Tr>
<Th />
<Th>Time</Th>
<Th>Model</Th>
<Th isNumeric>Duration</Th>
<Th isNumeric>Input tokens</Th>
<Th isNumeric>Output tokens</Th>
<Th isNumeric>Status</Th>
</Tr>
</Thead>
<Tbody>
{loggedCalls.data?.map((loggedCall) => {
return (
<TableRow
key={loggedCall.id}
loggedCall={loggedCall}
isExpanded={loggedCall.id === expandedRow}
onToggle={() => {
if (loggedCall.id === expandedRow) {
setExpandedRow(null);
} else {
setExpandedRow(loggedCall.id);
}
}}
/>
);
})}
</Tbody>
</Table>
</Card>
);
}

View File

@@ -0,0 +1,46 @@
import { Card, CardHeader, Heading, Table, Tbody, HStack, Button, Text } from "@chakra-ui/react";
import { useState } from "react";
import Link from "next/link";
import { useLoggedCalls } from "~/utils/hooks";
import { TableHeader, TableRow } from "../requestLogs/TableRow";
export default function LoggedCallsTable() {
const [expandedRow, setExpandedRow] = useState<string | null>(null);
const { data: loggedCalls } = useLoggedCalls();
return (
<Card width="100%" overflow="hidden">
<CardHeader>
<HStack justifyContent="space-between">
<Heading as="h3" size="sm">
Request Logs
</Heading>
<Button as={Link} href="/request-logs" variant="ghost" colorScheme="blue">
<Text>View All</Text>
</Button>
</HStack>
</CardHeader>
<Table>
<TableHeader />
<Tbody>
{loggedCalls?.calls.map((loggedCall) => {
return (
<TableRow
key={loggedCall.id}
loggedCall={loggedCall}
isExpanded={loggedCall.id === expandedRow}
onToggle={() => {
if (loggedCall.id === expandedRow) {
setExpandedRow(null);
} else {
setExpandedRow(loggedCall.id);
}
}}
/>
);
})}
</Tbody>
</Table>
</Card>
);
}

View File

@@ -1,21 +1,16 @@
import { type StackProps } from "@chakra-ui/react";
import { useDatasetEntries } from "~/utils/hooks";
import Paginator from "../Paginator";
const DatasetEntriesPaginator = () => {
const DatasetEntriesPaginator = (props: StackProps) => {
const { data } = useDatasetEntries();
if (!data) return null;
const { entries, startIndex, lastPage, count } = data;
const { count } = data;
return (
<Paginator
numItemsLoaded={entries.length}
startIndex={startIndex}
lastPage={lastPage}
count={count}
/>
);
return <Paginator count={count} {...props} />;
};
export default DatasetEntriesPaginator;

View File

@@ -7,6 +7,7 @@ import {
Spinner,
AspectRatio,
SkeletonText,
Card,
} from "@chakra-ui/react";
import { RiFlaskLine } from "react-icons/ri";
import { formatTimePast } from "~/utils/dayjs";
@@ -29,17 +30,22 @@ type ExperimentData = {
export const ExperimentCard = ({ exp }: { exp: ExperimentData }) => {
return (
<AspectRatio ratio={1.2} w="full">
<Card
w="full"
h="full"
cursor="pointer"
p={4}
bg="white"
borderRadius={4}
_hover={{ bg: "gray.100" }}
transition="background 0.2s"
aspectRatio={1.2}
>
<VStack
as={Link}
w="full"
h="full"
href={{ pathname: "/experiments/[id]", query: { id: exp.id } }}
bg="gray.50"
_hover={{ bg: "gray.100" }}
transition="background 0.2s"
cursor="pointer"
borderColor="gray.200"
borderWidth={1}
p={4}
justify="space-between"
>
<HStack w="full" color="gray.700" justify="center">
@@ -57,7 +63,7 @@ export const ExperimentCard = ({ exp }: { exp: ExperimentData }) => {
<Text flex={1}>Updated {formatTimePast(exp.updatedAt)}</Text>
</HStack>
</VStack>
</AspectRatio>
</Card>
);
};
@@ -89,30 +95,30 @@ export const NewExperimentCard = () => {
}, [createMutation, router, selectedProjectId]);
return (
<AspectRatio ratio={1.2} w="full">
<VStack
align="center"
justify="center"
_hover={{ cursor: "pointer", bg: "gray.50" }}
transition="background 0.2s"
cursor="pointer"
borderColor="gray.200"
borderWidth={1}
p={4}
onClick={createExperiment}
>
<Card
w="full"
h="full"
cursor="pointer"
p={4}
bg="white"
borderRadius={4}
_hover={{ bg: "gray.100" }}
transition="background 0.2s"
aspectRatio={1.2}
>
<VStack align="center" justify="center" w="full" h="full" p={4} onClick={createExperiment}>
<Icon as={isLoading ? Spinner : BsPlusSquare} boxSize={8} />
<Text display={{ base: "none", md: "block" }} ml={2}>
New Experiment
</Text>
</VStack>
</AspectRatio>
</Card>
);
};
export const ExperimentCardSkeleton = () => (
<AspectRatio ratio={1.2} w="full">
<VStack align="center" borderColor="gray.200" borderWidth={1} p={4} bg="gray.50">
<VStack align="center" borderColor="gray.200" borderWidth={1} p={4} bg="white">
<SkeletonText noOfLines={1} w="80%" />
<SkeletonText noOfLines={2} w="60%" />
<SkeletonText noOfLines={1} w="80%" />

View File

@@ -1,4 +1,4 @@
import { useState, useEffect } from "react";
import { useState, useEffect, useRef } from "react";
import {
Heading,
VStack,
@@ -9,14 +9,14 @@ import {
Box,
Link as ChakraLink,
Flex,
useBreakpointValue,
} from "@chakra-ui/react";
import Head from "next/head";
import Link from "next/link";
import { BsGearFill, BsGithub, BsPersonCircle } from "react-icons/bs";
import { IoStatsChartOutline } from "react-icons/io5";
import { RiDatabase2Line, RiFlaskLine } from "react-icons/ri";
import { RiHome3Line, RiDatabase2Line, RiFlaskLine } from "react-icons/ri";
import { signIn, useSession } from "next-auth/react";
import UserMenu from "./UserMenu";
import { env } from "~/env.mjs";
import ProjectMenu from "./ProjectMenu";
import NavSidebarOption from "./NavSidebarOption";
@@ -27,10 +27,16 @@ const Divider = () => <Box h="1px" bgColor="gray.300" w="full" />;
const NavSidebar = () => {
const user = useSession().data;
// Hack to get around initial flash, see https://github.com/chakra-ui/chakra-ui/issues/6452
const isMobile = useBreakpointValue({ base: true, md: false, ssr: false });
const renderCount = useRef(0);
renderCount.current++;
const displayLogo = isMobile && renderCount.current > 1;
return (
<VStack
align="stretch"
bgColor="gray.50"
py={2}
px={2}
pb={0}
@@ -40,32 +46,59 @@ const NavSidebar = () => {
borderRightWidth={1}
borderColor="gray.300"
>
<HStack
as={Link}
href="/"
_hover={{ textDecoration: "none" }}
spacing={{ base: 1, md: 0 }}
mx={2}
py={{ base: 1, md: 2 }}
>
<Image src="/logo.svg" alt="" boxSize={6} mr={4} ml={{ base: 0.5, md: 0 }} />
<Heading size="md" fontFamily="inconsolata, monospace">
OpenPipe
</Heading>
</HStack>
<Divider />
{displayLogo && (
<>
<HStack
as={Link}
href="/"
_hover={{ textDecoration: "none" }}
spacing={{ base: 1, md: 0 }}
mx={2}
py={{ base: 1, md: 2 }}
>
<Image src="/logo.svg" alt="" boxSize={6} mr={4} ml={{ base: 0.5, md: 0 }} />
<Heading size="md" fontFamily="inconsolata, monospace">
OpenPipe
</Heading>
</HStack>
<Divider />
</>
)}
<VStack align="flex-start" overflowY="auto" overflowX="hidden" flex={1}>
{user != null && (
<>
<ProjectMenu />
<Divider />
{env.NEXT_PUBLIC_FF_SHOW_LOGGED_CALLS && (
<IconLink icon={IoStatsChartOutline} label="Logged Calls" href="/logged-calls" beta />
<>
<IconLink icon={RiHome3Line} label="Dashboard" href="/dashboard" beta />
<IconLink
icon={IoStatsChartOutline}
label="Request Logs"
href="/request-logs"
beta
/>
</>
)}
<IconLink icon={RiFlaskLine} label="Experiments" href="/experiments" />
{env.NEXT_PUBLIC_SHOW_DATA && (
<IconLink icon={RiDatabase2Line} label="Data" href="/data" />
)}
<VStack w="full" alignItems="flex-start" spacing={0} pt={8}>
<Text
pl={2}
pb={2}
fontSize="xs"
fontWeight="bold"
color="gray.500"
display={{ base: "none", md: "flex" }}
>
CONFIGURATION
</Text>
<IconLink icon={BsGearFill} label="Project Settings" href="/project/settings" />
</VStack>
</>
)}
{user === null && (
@@ -87,20 +120,7 @@ const NavSidebar = () => {
</NavSidebarOption>
)}
</VStack>
<VStack w="full" alignItems="flex-start" spacing={0}>
<Text
pl={2}
pb={2}
fontSize="xs"
fontWeight="bold"
color="gray.500"
display={{ base: "none", md: "flex" }}
>
CONFIGURATION
</Text>
<IconLink icon={BsGearFill} label="Project Settings" href="/project/settings" />
</VStack>
{user && <UserMenu user={user} borderColor={"gray.200"} />}
<Divider />
<VStack spacing={0} align="center">
<ChakraLink
@@ -160,7 +180,7 @@ export default function AppShell({
<title>{title ? `${title} | OpenPipe` : "OpenPipe"}</title>
</Head>
<NavSidebar />
<Box h="100%" flex={1} overflowY="auto">
<Box h="100%" flex={1} overflowY="auto" bgColor="gray.50">
{children}
</Box>
</Flex>

View File

@@ -6,16 +6,18 @@ import {
PopoverTrigger,
PopoverContent,
Flex,
IconButton,
Icon,
Divider,
Button,
useDisclosure,
Spinner,
Link as ChakraLink,
Image,
Box,
} from "@chakra-ui/react";
import React, { useEffect, useState } from "react";
import Link from "next/link";
import { BsChevronRight, BsGear, BsPlus } from "react-icons/bs";
import { BsPlus, BsPersonCircle } from "react-icons/bs";
import { type Project } from "@prisma/client";
import { useAppStore } from "~/state/store";
@@ -23,13 +25,14 @@ import { api } from "~/utils/api";
import NavSidebarOption from "./NavSidebarOption";
import { useHandledAsyncCallback, useSelectedProject } from "~/utils/hooks";
import { useRouter } from "next/router";
import { useSession, signOut } from "next-auth/react";
export default function ProjectMenu() {
const router = useRouter();
const utils = api.useContext();
const selectedProjectId = useAppStore((s) => s.selectedProjectId);
const setselectedProjectId = useAppStore((s) => s.setselectedProjectId);
const setSelectedProjectId = useAppStore((s) => s.setSelectedProjectId);
const { data: projects } = api.projects.list.useQuery();
@@ -39,9 +42,9 @@ export default function ProjectMenu() {
projects[0] &&
(!selectedProjectId || !projects.find((proj) => proj.id === selectedProjectId))
) {
setselectedProjectId(projects[0].id);
setSelectedProjectId(projects[0].id);
}
}, [selectedProjectId, setselectedProjectId, projects]);
}, [selectedProjectId, setSelectedProjectId, projects]);
const { data: selectedProject } = useSelectedProject();
@@ -49,28 +52,32 @@ export default function ProjectMenu() {
const createMutation = api.projects.create.useMutation();
const [createProject, isLoading] = useHandledAsyncCallback(async () => {
const newProj = await createMutation.mutateAsync({ name: "New Project" });
const newProj = await createMutation.mutateAsync({ name: "Untitled Project" });
await utils.projects.list.invalidate();
setselectedProjectId(newProj.id);
setSelectedProjectId(newProj.id);
await router.push({ pathname: "/project/settings" });
}, [createMutation, router]);
const user = useSession().data;
const profileImage = user?.user.image ? (
<Image src={user.user.image} alt="profile picture" boxSize={6} borderRadius="50%" />
) : (
<Icon as={BsPersonCircle} boxSize={6} />
);
return (
<VStack w="full" alignItems="flex-start" spacing={0}>
<Text
pl={2}
pb={2}
fontSize="xs"
fontWeight="bold"
color="gray.500"
display={{ base: "none", md: "flex" }}
<VStack w="full" alignItems="flex-start" spacing={0} py={1}>
<Popover
placement="bottom"
isOpen={popover.isOpen}
onOpen={popover.onOpen}
onClose={popover.onClose}
closeOnBlur
>
PROJECT
</Text>
<Popover placement="right-end" isOpen={popover.isOpen} onClose={popover.onClose} closeOnBlur>
<PopoverTrigger>
<NavSidebarOption>
<HStack w="full" onClick={popover.onToggle}>
<HStack w="full">
<Flex
p={1}
borderRadius={4}
@@ -83,20 +90,35 @@ export default function ProjectMenu() {
>
<Text>{selectedProject?.name[0]?.toUpperCase()}</Text>
</Flex>
<Text fontSize="sm" display={{ base: "none", md: "block" }} py={1} flex={1}>
<Text
fontSize="sm"
display={{ base: "none", md: "block" }}
py={1}
flex={1}
fontWeight="bold"
>
{selectedProject?.name}
</Text>
<Icon as={BsChevronRight} boxSize={4} color="gray.500" />
<Box mr={2}>{profileImage}</Box>
</HStack>
</NavSidebarOption>
</PopoverTrigger>
<PopoverContent _focusVisible={{ outline: "unset" }} ml={-1} w="auto" minW={100} maxW={280}>
<VStack alignItems="flex-start" spacing={2} py={4} px={2}>
<Text color="gray.500" fontSize="xs" fontWeight="bold" pb={1}>
PROJECTS
<PopoverContent
_focusVisible={{ outline: "unset" }}
ml={-1}
w={224}
boxShadow="0 0 40px 4px rgba(0, 0, 0, 0.1);"
fontSize="sm"
>
<VStack alignItems="flex-start" spacing={1} py={1}>
<Text px={3} py={2}>
{user?.user.email}
</Text>
<Divider />
<VStack spacing={0} w="full">
<Text alignSelf="flex-start" fontWeight="bold" px={3} pt={2}>
Your Projects
</Text>
<VStack spacing={0} w="full" px={1}>
{projects?.map((proj) => (
<ProjectOption
key={proj.id}
@@ -105,19 +127,38 @@ export default function ProjectMenu() {
onClose={popover.onClose}
/>
))}
<HStack
as={Button}
variant="ghost"
colorScheme="blue"
color="blue.400"
fontSize="sm"
justifyContent="flex-start"
onClick={createProject}
w="full"
borderRadius={4}
spacing={0}
>
<Text>Add project</Text>
<Icon as={isLoading ? Spinner : BsPlus} boxSize={4} strokeWidth={0.5} />
</HStack>
</VStack>
<Divider />
<VStack w="full" px={1}>
<ChakraLink
onClick={() => {
signOut().catch(console.error);
}}
_hover={{ bgColor: "gray.200", textDecoration: "none" }}
w="full"
py={2}
px={2}
borderRadius={4}
>
<Text>Sign out</Text>
</ChakraLink>
</VStack>
<HStack
as={Button}
variant="ghost"
colorScheme="blue"
color="blue.400"
pr={8}
w="full"
onClick={createProject}
>
<Icon as={isLoading ? Spinner : BsPlus} boxSize={6} />
<Text>New project</Text>
</HStack>
</VStack>
</PopoverContent>
</Popover>
@@ -134,38 +175,27 @@ const ProjectOption = ({
isActive: boolean;
onClose: () => void;
}) => {
const setselectedProjectId = useAppStore((s) => s.setselectedProjectId);
const setSelectedProjectId = useAppStore((s) => s.setSelectedProjectId);
const [gearHovered, setGearHovered] = useState(false);
return (
<HStack
as={Link}
href="/experiments"
onClick={() => {
setselectedProjectId(proj.id);
setSelectedProjectId(proj.id);
onClose();
}}
w="full"
justifyContent="space-between"
bgColor={isActive ? "gray.100" : "transparent"}
_hover={gearHovered ? undefined : { bgColor: "gray.200", textDecoration: "none" }}
p={2}
color={isActive ? "blue.400" : undefined}
py={2}
px={4}
borderRadius={4}
spacing={4}
>
<Text>{proj.name}</Text>
<IconButton
as={Link}
href="/project/settings"
aria-label={`Open ${proj.name} settings`}
icon={<Icon as={BsGear} boxSize={5} strokeWidth={0.5} color="gray.500" />}
variant="ghost"
size="xs"
p={0}
onMouseEnter={() => setGearHovered(true)}
onMouseLeave={() => setGearHovered(false)}
_hover={{ bgColor: isActive ? "gray.300" : "gray.100", transitionDelay: 0 }}
borderRadius={4}
/>
</HStack>
);
};

View File

@@ -0,0 +1,30 @@
import { Button, HStack, type ButtonProps, Icon, Text } from "@chakra-ui/react";
import { type IconType } from "react-icons";
const ActionButton = ({
icon,
label,
...buttonProps
}: { icon: IconType; label: string } & ButtonProps) => {
return (
<Button
colorScheme="blue"
color="black"
bgColor="white"
borderColor="gray.300"
borderRadius={4}
variant="outline"
size="sm"
fontSize="sm"
fontWeight="normal"
{...buttonProps}
>
<HStack spacing={1}>
{icon && <Icon as={icon} />}
<Text>{label}</Text>
</HStack>
</Button>
);
};
export default ActionButton;
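A minimal usage sketch of ActionButton as defined above: icon and label are the only custom props, and everything else (onClick, isDisabled, etc.) falls through to Chakra's Button via {...buttonProps}. The import path and the handler are assumptions:

import { BsPlus } from "react-icons/bs";
import ActionButton from "~/components/ActionButton"; // path assumed

const AddTagButton = ({ onAdd }: { onAdd: () => void }) => (
  // Renders an outlined white button with a plus icon next to the label.
  <ActionButton icon={BsPlus} label="Add tag" onClick={onAdd} />
);

export default AddTagButton;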

View File

@@ -0,0 +1,55 @@
import { Box, IconButton, useToast } from "@chakra-ui/react";
import { CopyIcon } from "lucide-react";
import SyntaxHighlighter from "react-syntax-highlighter";
import { atelierCaveLight } from "react-syntax-highlighter/dist/cjs/styles/hljs";
import stringify from "json-stringify-pretty-compact";
const FormattedJson = ({ json }: { json: any }) => {
const jsonString = stringify(json, { maxLength: 40 });
const toast = useToast();
const copyToClipboard = async (text: string) => {
try {
await navigator.clipboard.writeText(text);
toast({
title: "Copied to clipboard",
status: "success",
duration: 2000,
});
} catch (err) {
toast({
title: "Failed to copy to clipboard",
status: "error",
duration: 2000,
});
}
};
return (
<Box position="relative" fontSize="sm" borderRadius="md" overflow="hidden">
<SyntaxHighlighter
customStyle={{ overflowX: "unset" }}
language="json"
style={atelierCaveLight}
lineProps={{
style: { wordBreak: "break-all", whiteSpace: "pre-wrap" },
}}
wrapLines
>
{jsonString}
</SyntaxHighlighter>
<IconButton
aria-label="Copy"
icon={<CopyIcon />}
position="absolute"
top={1}
right={1}
size="xs"
variant="ghost"
onClick={() => void copyToClipboard(jsonString)}
/>
</Box>
);
};
export { FormattedJson };

View File

@@ -0,0 +1,23 @@
import { Button, HStack, Icon, Text } from "@chakra-ui/react";
import { BsPlus } from "react-icons/bs";
import { comparators, defaultFilterableFields } from "~/state/logFiltersSlice";
import { useAppStore } from "~/state/store";
const AddFilterButton = () => {
const addFilter = useAppStore((s) => s.logFilters.addFilter);
return (
<HStack
as={Button}
variant="ghost"
onClick={() => addFilter({ field: defaultFilterableFields[0], comparator: comparators[0] })}
spacing={0}
fontSize="sm"
>
<Icon as={BsPlus} boxSize={5} />
<Text>Add Filter</Text>
</HStack>
);
};
export default AddFilterButton;

View File

@@ -0,0 +1,49 @@
import { useCallback, useState } from "react";
import { HStack, IconButton, Input } from "@chakra-ui/react";
import { BsTrash } from "react-icons/bs";
import { type LogFilter } from "~/state/logFiltersSlice";
import { useAppStore } from "~/state/store";
import { debounce } from "lodash-es";
import SelectFieldDropdown from "./SelectFieldDropdown";
import SelectComparatorDropdown from "./SelectComparatorDropdown";
const LogFilter = ({ filter, index }: { filter: LogFilter; index: number }) => {
const updateFilter = useAppStore((s) => s.logFilters.updateFilter);
const deleteFilter = useAppStore((s) => s.logFilters.deleteFilter);
const [editedValue, setEditedValue] = useState("");
const debouncedUpdateFilter = useCallback(
debounce(
(index: number, filter: LogFilter) => {
console.log("updating filter!!!");
updateFilter(index, filter);
},
200,
{ leading: true },
),
[updateFilter],
);
return (
<HStack>
<SelectFieldDropdown filter={filter} index={index} />
<SelectComparatorDropdown filter={filter} index={index} />
<Input
value={editedValue}
onChange={(e) => {
setEditedValue(e.target.value);
debouncedUpdateFilter(index, { ...filter, value: e.target.value });
}}
/>
<IconButton
aria-label="Delete Filter"
icon={<BsTrash />}
onClick={() => deleteFilter(index)}
/>
</HStack>
);
};
export default LogFilter;
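For context, a small standalone sketch (not part of the PR) of how lodash-es debounce behaves with { leading: true }, since that option drives the filter-update timing above:

import { debounce } from "lodash-es";

// leading: true fires the wrapped function immediately on the first call;
// calls made during the 200ms window are coalesced into one trailing call
// that runs with the latest arguments.
const update = debounce((value: string) => console.log("update:", value), 200, {
  leading: true,
});

update("g");   // runs immediately ("g")
update("gp");  // swallowed while the timer is pending
update("gpt"); // trailing call runs ~200ms after the last keystroke ("gpt")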

View File

@@ -0,0 +1,30 @@
import { VStack, Text } from "@chakra-ui/react";
import AddFilterButton from "./AddFilterButton";
import { useAppStore } from "~/state/store";
import LogFilter from "./LogFilter";
const LogFilters = () => {
const filters = useAppStore((s) => s.logFilters.filters);
return (
<VStack
bgColor="white"
borderRadius={8}
borderWidth={1}
w="full"
alignItems="flex-start"
p={4}
spacing={4}
>
<Text fontWeight="bold" color="gray.500">
Filters
</Text>
{filters.map((filter, index) => (
<LogFilter key={index} filter={filter} index={index} />
))}
<AddFilterButton />
</VStack>
);
};
export default LogFilters;

View File

@@ -0,0 +1,19 @@
import { comparators, type LogFilter } from "~/state/logFiltersSlice";
import { useAppStore } from "~/state/store";
import InputDropdown from "~/components/InputDropdown";
const SelectComparatorDropdown = ({ filter, index }: { filter: LogFilter; index: number }) => {
const updateFilter = useAppStore((s) => s.logFilters.updateFilter);
const { comparator } = filter;
return (
<InputDropdown
options={comparators}
selectedOption={comparator}
onSelect={(option) => updateFilter(index, { ...filter, comparator: option })}
/>
);
};
export default SelectComparatorDropdown;

View File

@@ -0,0 +1,22 @@
import { defaultFilterableFields, type LogFilter } from "~/state/logFiltersSlice";
import { useAppStore } from "~/state/store";
import { useTagNames } from "~/utils/hooks";
import InputDropdown from "~/components/InputDropdown";
const SelectFieldDropdown = ({ filter, index }: { filter: LogFilter; index: number }) => {
const tagNames = useTagNames().data;
const updateFilter = useAppStore((s) => s.logFilters.updateFilter);
const { field } = filter;
return (
<InputDropdown
options={[...defaultFilterableFields, ...(tagNames || [])]}
selectedOption={field}
onSelect={(option) => updateFilter(index, { ...filter, field: option })}
/>
);
};
export default SelectFieldDropdown;

View File

@@ -0,0 +1,16 @@
import { type StackProps } from "@chakra-ui/react";
import { useLoggedCalls } from "~/utils/hooks";
import Paginator from "../Paginator";
const LoggedCallsPaginator = (props: StackProps) => {
const { data } = useLoggedCalls();
if (!data) return null;
const { count } = data;
return <Paginator count={count} {...props} />;
};
export default LoggedCallsPaginator;

View File

@@ -0,0 +1,36 @@
import { Card, Table, Tbody } from "@chakra-ui/react";
import { useState } from "react";
import { useLoggedCalls } from "~/utils/hooks";
import { TableHeader, TableRow } from "./TableRow";
export default function LoggedCallsTable() {
const [expandedRow, setExpandedRow] = useState<string | null>(null);
const loggedCalls = useLoggedCalls().data;
return (
<Card width="100%" overflowX="auto">
<Table>
<TableHeader showCheckbox />
<Tbody>
{loggedCalls?.calls?.map((loggedCall) => {
return (
<TableRow
key={loggedCall.id}
loggedCall={loggedCall}
isExpanded={loggedCall.id === expandedRow}
onToggle={() => {
if (loggedCall.id === expandedRow) {
setExpandedRow(null);
} else {
setExpandedRow(loggedCall.id);
}
}}
showCheckbox
/>
);
})}
</Tbody>
</Table>
</Card>
);
}

View File

@@ -0,0 +1,170 @@
import {
Box,
Heading,
Td,
Tr,
Thead,
Th,
Tooltip,
Collapse,
HStack,
VStack,
Button,
ButtonGroup,
Text,
Checkbox,
} from "@chakra-ui/react";
import dayjs from "dayjs";
import relativeTime from "dayjs/plugin/relativeTime";
import Link from "next/link";
import { type RouterOutputs } from "~/utils/api";
import { FormattedJson } from "./FormattedJson";
import { useAppStore } from "~/state/store";
import { useLoggedCalls, useTagNames } from "~/utils/hooks";
import { useMemo } from "react";
dayjs.extend(relativeTime);
type LoggedCall = RouterOutputs["loggedCalls"]["list"]["calls"][0];
export const TableHeader = ({ showCheckbox }: { showCheckbox?: boolean }) => {
const matchingLogIds = useLoggedCalls().data?.matchingLogIds;
const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
const addAll = useAppStore((s) => s.selectedLogs.addSelectedLogIds);
const clearAll = useAppStore((s) => s.selectedLogs.clearSelectedLogIds);
const allSelected = useMemo(() => {
if (!matchingLogIds || !matchingLogIds.length) return false;
return matchingLogIds.every((id) => selectedLogIds.has(id));
}, [selectedLogIds, matchingLogIds]);
const tagNames = useTagNames().data;
return (
<Thead>
<Tr>
{showCheckbox && (
<Th pr={0}>
<HStack minW={16}>
<Checkbox
isChecked={allSelected}
onChange={() => {
allSelected ? clearAll() : addAll(matchingLogIds || []);
}}
/>
<Text>
({selectedLogIds.size ? `${selectedLogIds.size}/` : ""}
{matchingLogIds?.length || 0})
</Text>
</HStack>
</Th>
)}
<Th>Sent At</Th>
<Th>Model</Th>
{tagNames?.map((tagName) => <Th key={tagName}>{tagName}</Th>)}
<Th isNumeric>Duration</Th>
<Th isNumeric>Input tokens</Th>
<Th isNumeric>Output tokens</Th>
<Th isNumeric>Status</Th>
</Tr>
</Thead>
);
};
export const TableRow = ({
loggedCall,
isExpanded,
onToggle,
showCheckbox,
}: {
loggedCall: LoggedCall;
isExpanded: boolean;
onToggle: () => void;
showCheckbox?: boolean;
}) => {
const isError = loggedCall.modelResponse?.statusCode !== 200;
const requestedAt = dayjs(loggedCall.requestedAt).format("MMMM D h:mm A");
const fullTime = dayjs(loggedCall.requestedAt).toString();
const isChecked = useAppStore((s) => s.selectedLogs.selectedLogIds.has(loggedCall.id));
const toggleChecked = useAppStore((s) => s.selectedLogs.toggleSelectedLogId);
const tagNames = useTagNames().data;
return (
<>
<Tr
onClick={onToggle}
key={loggedCall.id}
_hover={{ bgColor: "gray.50", cursor: "pointer" }}
sx={{
"> td": { borderBottom: "none" },
}}
fontSize="sm"
>
{showCheckbox && (
<Td>
<Checkbox isChecked={isChecked} onChange={() => toggleChecked(loggedCall.id)} />
</Td>
)}
<Td>
<Tooltip label={fullTime} placement="top">
<Box whiteSpace="nowrap" minW="120px">
{requestedAt}
</Box>
</Tooltip>
</Td>
<Td>
<HStack justifyContent="flex-start">
<Text
colorScheme="purple"
color="purple.500"
borderColor="purple.500"
px={1}
borderRadius={4}
borderWidth={1}
fontSize="xs"
whiteSpace="nowrap"
>
{loggedCall.model}
</Text>
</HStack>
</Td>
{tagNames?.map((tagName) => <Td key={tagName}>{loggedCall.tags[tagName]}</Td>)}
<Td isNumeric>
{loggedCall.cacheHit ? (
<Text color="gray.500">Cached</Text>
) : (
((loggedCall.modelResponse?.durationMs ?? 0) / 1000).toFixed(2) + "s"
)}
</Td>
<Td isNumeric>{loggedCall.modelResponse?.inputTokens}</Td>
<Td isNumeric>{loggedCall.modelResponse?.outputTokens}</Td>
<Td sx={{ color: isError ? "red.500" : "green.500", fontWeight: "semibold" }} isNumeric>
{loggedCall.modelResponse?.statusCode ?? "No response"}
</Td>
</Tr>
<Tr>
<Td colSpan={8} p={0}>
<Collapse in={isExpanded} unmountOnExit={true}>
<VStack p={4} align="stretch">
<HStack align="stretch">
<VStack flex={1} align="stretch">
<Heading size="sm">Input</Heading>
<FormattedJson json={loggedCall.modelResponse?.reqPayload} />
</VStack>
<VStack flex={1} align="stretch">
<Heading size="sm">Output</Heading>
<FormattedJson json={loggedCall.modelResponse?.respPayload} />
</VStack>
</HStack>
<ButtonGroup alignSelf="flex-end">
<Button as={Link} colorScheme="blue" href={{ pathname: "/experiments" }}>
Experiments
</Button>
</ButtonGroup>
</VStack>
</Collapse>
</Td>
</Tr>
</>
);
};

View File

@@ -1,54 +1,10 @@
/* eslint-disable @typescript-eslint/no-unsafe-call */
import {
type ChatCompletionChunk,
type ChatCompletion,
type CompletionCreateParams,
} from "openai/resources/chat";
import { type CompletionResponse } from "../types";
import { isArray, isString, omit } from "lodash-es";
import { openai } from "~/server/utils/openai";
import { isArray, isString } from "lodash-es";
import { APIError } from "openai";
const mergeStreamedChunks = (
base: ChatCompletion | null,
chunk: ChatCompletionChunk,
): ChatCompletion => {
if (base === null) {
return mergeStreamedChunks({ ...chunk, choices: [] }, chunk);
}
const choices = [...base.choices];
for (const choice of chunk.choices) {
const baseChoice = choices.find((c) => c.index === choice.index);
if (baseChoice) {
baseChoice.finish_reason = choice.finish_reason ?? baseChoice.finish_reason;
baseChoice.message = baseChoice.message ?? { role: "assistant" };
if (choice.delta?.content)
baseChoice.message.content =
((baseChoice.message.content as string) ?? "") + (choice.delta.content ?? "");
if (choice.delta?.function_call) {
const fnCall = baseChoice.message.function_call ?? {};
fnCall.name =
((fnCall.name as string) ?? "") + ((choice.delta.function_call.name as string) ?? "");
fnCall.arguments =
((fnCall.arguments as string) ?? "") +
((choice.delta.function_call.arguments as string) ?? "");
}
} else {
// @ts-expect-error the types are correctly telling us that finish_reason
// could be null, but don't want to fix it right now.
choices.push({ ...omit(choice, "delta"), message: { role: "assistant", ...choice.delta } });
}
}
const merged: ChatCompletion = {
...base,
choices,
};
return merged;
};
import { type ChatCompletion, type CompletionCreateParams } from "openai/resources/chat";
import mergeChunks from "openpipe/src/openai/mergeChunks";
import { openai } from "~/server/utils/openai";
import { type CompletionResponse } from "../types";
export async function getCompletion(
input: CompletionCreateParams,
@@ -59,7 +15,6 @@ export async function getCompletion(
try {
if (onStream) {
console.log("got started");
const resp = await openai.chat.completions.create(
{ ...input, stream: true },
{
@@ -67,11 +22,9 @@ export async function getCompletion(
},
);
for await (const part of resp) {
console.log("got part", part);
finalCompletion = mergeStreamedChunks(finalCompletion, part);
finalCompletion = mergeChunks(finalCompletion, part);
onStream(finalCompletion);
}
console.log("got final", finalCompletion);
if (!finalCompletion) {
return {
type: "error",

View File

@@ -120,9 +120,9 @@ export const refinementActions: Record<string, RefinementAction> = {
"Convert to function call": {
icon: TfiThought,
description: "Use function calls to get output from the model in a more structured way.",
instructions: `OpenAI functions are a specialized way for an LLM to return output.
instructions: `OpenAI functions are a specialized way for an LLM to return its final output.
This is what a prompt looks like before adding a function:
Example 1 before:
definePrompt("openai/ChatCompletion", {
model: "gpt-4",
@@ -139,7 +139,7 @@ export const refinementActions: Record<string, RefinementAction> = {
],
});
This is what one looks like after adding a function:
Example 1 after:
definePrompt("openai/ChatCompletion", {
model: "gpt-4",
@@ -156,7 +156,7 @@ export const refinementActions: Record<string, RefinementAction> = {
],
functions: [
{
name: "extract_sentiment",
name: "log_extracted_sentiment",
parameters: {
type: "object", // parameters must always be an object with a properties key
properties: { // properties key is required
@@ -169,13 +169,13 @@ export const refinementActions: Record<string, RefinementAction> = {
},
],
function_call: {
name: "extract_sentiment",
name: "log_extracted_sentiment",
},
});
Here's another example of adding a function:
Before:
=========
Example 2 before:
definePrompt("openai/ChatCompletion", {
model: "gpt-3.5-turbo",
@@ -197,7 +197,7 @@ export const refinementActions: Record<string, RefinementAction> = {
temperature: 0,
});
After:
Example 2 after:
definePrompt("openai/ChatCompletion", {
model: "gpt-3.5-turbo",
@@ -215,7 +215,7 @@ export const refinementActions: Record<string, RefinementAction> = {
temperature: 0,
functions: [
{
name: "score_post",
name: "log_post_score",
parameters: {
type: "object",
properties: {
@@ -227,13 +227,13 @@ export const refinementActions: Record<string, RefinementAction> = {
},
],
function_call: {
name: "score_post",
name: "log_post_score",
},
});
Another example
=========
Before:
Example 3 before:
definePrompt("openai/ChatCompletion", {
model: "gpt-3.5-turbo",
@@ -246,7 +246,7 @@ export const refinementActions: Record<string, RefinementAction> = {
],
});
After:
Example 3 after:
definePrompt("openai/ChatCompletion", {
model: "gpt-3.5-turbo",
@@ -258,21 +258,24 @@ export const refinementActions: Record<string, RefinementAction> = {
],
functions: [
{
name: "write_in_language",
name: "log_translated_text",
parameters: {
type: "object",
properties: {
text: {
translated_text: {
type: "string",
description: "The text, written in the language specified in the prompt",
},
},
},
},
],
function_call: {
name: "write_in_language",
name: "log_translated_text",
},
});
=========
Add an OpenAI function that takes one or more nested parameters that match the expected output from this prompt.`,
},
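For context, this is roughly the shape of the assistant message those refined prompts aim to elicit: the model's answer arrives as a function call rather than free text. The `sentiment` argument name is a hypothetical stand-in, since the full parameter schema is elided in the diff above.

```ts
// Illustrative only: a chat completion choice whose output comes back via a
// function call. The "sentiment" property is hypothetical.
const exampleChoice = {
  index: 0,
  finish_reason: "function_call",
  message: {
    role: "assistant",
    content: null,
    // `arguments` is always a JSON-encoded string
    function_call: {
      name: "log_extracted_sentiment",
      arguments: JSON.stringify({ sentiment: "positive" }),
    },
  },
};

// Consumers parse the arguments back into a structured object:
const args = JSON.parse(exampleChoice.message.function_call.arguments) as {
  sentiment: string;
};
console.log(args.sentiment); // "positive"
```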

View File

@@ -8,7 +8,7 @@ import { ChakraThemeProvider } from "~/theme/ChakraThemeProvider";
import { SyncAppStore } from "~/state/sync";
import NextAdapterApp from "next-query-params/app";
import { QueryParamProvider } from "use-query-params";
import { SessionIdentifier } from "~/utils/analytics/clientAnalytics";
import { PosthogAppProvider } from "~/utils/analytics/posthog";
const MyApp: AppType<{ session: Session | null }> = ({
Component,
@@ -34,14 +34,15 @@ const MyApp: AppType<{ session: Session | null }> = ({
<meta name="twitter:image" content="/og.png" />
</Head>
<SessionProvider session={session}>
<SyncAppStore />
<Favicon />
<SessionIdentifier />
<ChakraThemeProvider>
<QueryParamProvider adapter={NextAdapterApp}>
<Component {...pageProps} />
</QueryParamProvider>
</ChakraThemeProvider>
<PosthogAppProvider>
<SyncAppStore />
<Favicon />
<ChakraThemeProvider>
<QueryParamProvider adapter={NextAdapterApp}>
<Component {...pageProps} />
</QueryParamProvider>
</ChakraThemeProvider>
</PosthogAppProvider>
</SessionProvider>
</>
);

View File

@@ -1,6 +0,0 @@
// A faulty API route to test Sentry's error monitoring
// @ts-expect-error just a test file, don't care about types
export default function handler(_req, res) {
throw new Error("Sentry Example API Route Error");
res.status(200).json({ name: "John Doe" });
}

View File

@@ -1,17 +1,14 @@
import { type NextApiRequest, type NextApiResponse } from "next";
import cors from "nextjs-cors";
import { createOpenApiNextHandler } from "trpc-openapi";
import { createProcedureCache } from "trpc-openapi/dist/adapters/node-http/procedures";
import { appRouter } from "~/server/api/root.router";
import { createTRPCContext } from "~/server/api/trpc";
import { v1ApiRouter } from "~/server/api/external/v1Api.router";
import { createOpenApiContext } from "~/server/api/external/openApiTrpc";
const openApiHandler = createOpenApiNextHandler({
router: appRouter,
createContext: createTRPCContext,
router: v1ApiRouter,
createContext: createOpenApiContext,
});
const cache = createProcedureCache(appRouter);
const handler = async (req: NextApiRequest, res: NextApiResponse) => {
// Setup CORS
await cors(req, res);

View File

@@ -1,12 +1,12 @@
import { type NextApiRequest, type NextApiResponse } from "next";
import { generateOpenApiDocument } from "trpc-openapi";
import { appRouter } from "~/server/api/root.router";
import { v1ApiRouter } from "~/server/api/external/v1Api.router";
export const openApiDocument = generateOpenApiDocument(appRouter, {
export const openApiDocument = generateOpenApiDocument(v1ApiRouter, {
title: "OpenPipe API",
description: "The public API for reporting API calls to OpenPipe",
version: "0.1.0",
baseUrl: "https://app.openpipe.ai/api",
version: "0.1.1",
baseUrl: "https://app.openpipe.ai/api/v1",
});
// Respond with our OpenAPI schema
const hander = (req: NextApiRequest, res: NextApiResponse) => {

View File

@@ -15,20 +15,16 @@ import {
Tr,
Td,
Divider,
Breadcrumb,
BreadcrumbItem,
} from "@chakra-ui/react";
import { Ban, DollarSign, Hash } from "lucide-react";
import AppShell from "~/components/nav/AppShell";
import PageHeaderContainer from "~/components/nav/PageHeaderContainer";
import ProjectBreadcrumbContents from "~/components/nav/ProjectBreadcrumbContents";
import { useSelectedProject } from "~/utils/hooks";
import { api } from "~/utils/api";
import LoggedCallTable from "~/components/dashboard/LoggedCallTable";
import LoggedCallsTable from "~/components/dashboard/LoggedCallsTable";
import UsageGraph from "~/components/dashboard/UsageGraph";
export default function LoggedCalls() {
export default function Dashboard() {
const { data: selectedProject } = useSelectedProject();
const stats = api.dashboard.stats.useQuery(
@@ -37,25 +33,15 @@ export default function LoggedCalls() {
);
return (
<AppShell title="Logged Calls" requireAuth>
<PageHeaderContainer>
<Breadcrumb>
<BreadcrumbItem>
<ProjectBreadcrumbContents />
</BreadcrumbItem>
<BreadcrumbItem isCurrentPage>
<Text>Logged Calls</Text>
</BreadcrumbItem>
</Breadcrumb>
</PageHeaderContainer>
<VStack px={8} pt={4} alignItems="flex-start" spacing={4}>
<AppShell title="Dashboard" requireAuth>
<VStack px={8} py={8} alignItems="flex-start" spacing={4}>
<Text fontSize="2xl" fontWeight="bold">
{selectedProject?.name}
Dashboard
</Text>
<Divider />
<VStack margin="auto" spacing={4} align="stretch" w="full">
<HStack gap={4} align="start">
<Card variant="outline" flex={1}>
<Card flex={1}>
<CardHeader>
<Heading as="h3" size="sm">
Usage Statistics
@@ -66,7 +52,7 @@ export default function LoggedCalls() {
</CardBody>
</Card>
<VStack spacing="4" width="300px" align="stretch">
<Card variant="outline">
<Card>
<CardBody>
<Stat>
<HStack>
@@ -79,7 +65,7 @@ export default function LoggedCalls() {
</Stat>
</CardBody>
</Card>
<Card variant="outline">
<Card>
<CardBody>
<Stat>
<HStack>
@@ -94,7 +80,7 @@ export default function LoggedCalls() {
</Stat>
</CardBody>
</Card>
<Card variant="outline" overflow="hidden">
<Card overflow="hidden">
<Stat>
<CardHeader>
<HStack>
@@ -120,7 +106,7 @@ export default function LoggedCalls() {
</Card>
</VStack>
</HStack>
<LoggedCallTable />
<LoggedCallsTable />
</VStack>
</VStack>
</AppShell>

View File

@@ -62,7 +62,7 @@ export default function Experiment() {
useEffect(() => {
useAppStore.getState().sharedVariantEditor.loadMonaco().catch(console.error);
});
}, []);
const [label, setLabel] = useState(experiment.data?.label || "");
useEffect(() => {

View File

@@ -65,7 +65,7 @@ export default function Settings() {
</BreadcrumbItem>
</Breadcrumb>
</PageHeaderContainer>
<VStack px={8} pt={4} alignItems="flex-start" spacing={4}>
<VStack px={8} py={4} alignItems="flex-start" spacing={4}>
<VStack spacing={0} alignItems="flex-start">
<Text fontSize="2xl" fontWeight="bold">
Project Settings
@@ -80,6 +80,7 @@ export default function Settings() {
borderWidth={1}
borderRadius={4}
borderColor="gray.300"
bgColor="white"
p={6}
spacing={6}
>

View File

@@ -0,0 +1,48 @@
import { useState } from "react";
import { Text, VStack, Divider, HStack } from "@chakra-ui/react";
import AppShell from "~/components/nav/AppShell";
import LoggedCallTable from "~/components/requestLogs/LoggedCallsTable";
import LoggedCallsPaginator from "~/components/requestLogs/LoggedCallsPaginator";
import ActionButton from "~/components/requestLogs/ActionButton";
import { useAppStore } from "~/state/store";
import { RiFlaskLine } from "react-icons/ri";
import { FiFilter } from "react-icons/fi";
import LogFilters from "~/components/requestLogs/LogFilters/LogFilters";
export default function LoggedCalls() {
const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
const [filtersShown, setFiltersShown] = useState(true);
return (
<AppShell title="Request Logs" requireAuth>
<VStack px={8} py={8} alignItems="flex-start" spacing={4} w="full">
<Text fontSize="2xl" fontWeight="bold">
Request Logs
</Text>
<Divider />
<HStack w="full" justifyContent="flex-end">
<ActionButton
onClick={() => {
setFiltersShown(!filtersShown);
}}
label={filtersShown ? "Hide Filters" : "Show Filters"}
icon={FiFilter}
/>
<ActionButton
onClick={() => {
console.log("experimenting with these ids", selectedLogIds);
}}
label="Experiment"
icon={RiFlaskLine}
isDisabled={selectedLogIds.size === 0}
/>
</HStack>
{filtersShown && <LogFilters />}
<LoggedCallTable />
<LoggedCallsPaginator />
</VStack>
</AppShell>
);
}

View File

@@ -4,11 +4,15 @@ import parserTypescript from "prettier/plugins/typescript";
// @ts-expect-error for some reason missing from types
import parserEstree from "prettier/plugins/estree";
// This emits a warning in the browser "Critical dependency: the request of a
// dependency is an expression". Unfortunately there doesn't seem to be a way to get
// around it if we want to use Babel client-side for now. One solution would be
// to just do the formatting server-side in a trpc call.
// https://github.com/babel/babel/issues/14301
import * as babel from "@babel/standalone";
export function stripTypes(tsCode: string): string {
const options = {
presets: ["typescript"],
filename: "file.ts",
};

View File

@@ -0,0 +1,95 @@
import type { ApiKey, Project } from "@prisma/client";
import { TRPCError, initTRPC } from "@trpc/server";
import { type CreateNextContextOptions } from "@trpc/server/adapters/next";
import superjson from "superjson";
import { type OpenApiMeta } from "trpc-openapi";
import { ZodError } from "zod";
import { prisma } from "~/server/db";
type CreateContextOptions = {
key:
| (ApiKey & {
project: Project;
})
| null;
};
/**
* This helper generates the "internals" for a tRPC context. If you need to use it, you can export
* it from here.
*
* Examples of things you may need it for:
* - testing, so we don't have to mock Next.js' req/res
* - tRPC's `createSSGHelpers`, where we don't have req/res
*
* @see https://create.t3.gg/en/usage/trpc#-serverapitrpcts
*/
export const createInnerTRPCContext = (opts: CreateContextOptions) => {
return {
key: opts.key,
};
};
export const createOpenApiContext = async (opts: CreateNextContextOptions) => {
const { req, res } = opts;
const apiKey = req.headers.authorization?.split(" ")[1] as string | null;
if (!apiKey) {
throw new TRPCError({ code: "UNAUTHORIZED" });
}
const key = await prisma.apiKey.findUnique({
where: { apiKey },
include: { project: true },
});
if (!key) {
throw new TRPCError({ code: "UNAUTHORIZED" });
}
return createInnerTRPCContext({
key,
});
};
export type TRPCContext = Awaited<ReturnType<typeof createOpenApiContext>>;
const t = initTRPC
.context<typeof createOpenApiContext>()
.meta<OpenApiMeta>()
.create({
transformer: superjson,
errorFormatter({ shape, error }) {
return {
...shape,
data: {
...shape.data,
zodError: error.cause instanceof ZodError ? error.cause.flatten() : null,
},
};
},
});
export const createOpenApiRouter = t.router;
export const openApiPublicProc = t.procedure;
/** Reusable middleware that enforces a valid API key before running the procedure. */
const enforceApiKey = t.middleware(async ({ ctx, next }) => {
if (!ctx.key) {
throw new TRPCError({ code: "UNAUTHORIZED" });
}
return next({
ctx: { key: ctx.key },
});
});
/**
* Protected (authenticated) procedure
*
 * If you want a query or mutation to ONLY be accessible to callers with a valid API key, use this.
 * It verifies the key and guarantees `ctx.key` is not null.
*
* @see https://trpc.io/docs/procedures
*/
export const openApiProtectedProc = t.procedure.use(enforceApiKey);
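To illustrate how these exports are meant to be combined, here is a hypothetical endpoint (not part of this changeset): because `enforceApiKey` throws `UNAUTHORIZED` before the handler runs, procedures built on `openApiProtectedProc` can read `ctx.key` without a null check.

```ts
// Hypothetical router, sketched only to show the ctx.key guarantee.
import { z } from "zod";
import { createOpenApiRouter, openApiProtectedProc } from "./openApiTrpc";

export const exampleRouter = createOpenApiRouter({
  whoami: openApiProtectedProc
    .meta({
      openapi: {
        method: "GET",
        path: "/whoami", // hypothetical path
        description: "Return the project the API key belongs to",
        protect: true,
      },
    })
    .input(z.void())
    .output(z.object({ projectId: z.string() }))
    .query(({ ctx }) => ({
      // enforceApiKey has already rejected requests without a valid key,
      // so ctx.key is non-null here.
      projectId: ctx.key.projectId,
    })),
});
```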

View File

@@ -2,9 +2,6 @@ import { type Prisma } from "@prisma/client";
import { type JsonValue } from "type-fest";
import { z } from "zod";
import { v4 as uuidv4 } from "uuid";
import { TRPCError } from "@trpc/server";
import { createTRPCRouter, publicProcedure } from "~/server/api/trpc";
import { prisma } from "~/server/db";
import { hashRequest } from "~/server/utils/hashObject";
import modelProvider from "~/modelProviders/openai-ChatCompletion";
@@ -12,6 +9,7 @@ import {
type ChatCompletion,
type CompletionCreateParams,
} from "openai/resources/chat/completions";
import { createOpenApiRouter, openApiProtectedProc } from "./openApiTrpc";
const reqValidator = z.object({
model: z.string(),
@@ -28,12 +26,12 @@ const respValidator = z.object({
),
});
export const externalApiRouter = createTRPCRouter({
checkCache: publicProcedure
export const v1ApiRouter = createOpenApiRouter({
checkCache: openApiProtectedProc
.meta({
openapi: {
method: "POST",
path: "/v1/check-cache",
path: "/check-cache",
description: "Check if a prompt is cached",
protect: true,
},
@@ -47,7 +45,8 @@ export const externalApiRouter = createTRPCRouter({
.optional()
.describe(
'Extra tags to attach to the call for filtering. Eg { "userId": "123", "promptId": "populate-title" }',
),
)
.default({}),
}),
)
.output(
@@ -56,18 +55,8 @@ export const externalApiRouter = createTRPCRouter({
}),
)
.mutation(async ({ input, ctx }) => {
const apiKey = ctx.apiKey;
if (!apiKey) {
throw new TRPCError({ code: "UNAUTHORIZED" });
}
const key = await prisma.apiKey.findUnique({
where: { apiKey },
});
if (!key) {
throw new TRPCError({ code: "UNAUTHORIZED" });
}
const reqPayload = await reqValidator.spa(input.reqPayload);
const cacheKey = hashRequest(key.projectId, reqPayload as JsonValue);
const cacheKey = hashRequest(ctx.key.projectId, reqPayload as JsonValue);
const existingResponse = await prisma.loggedCallModelResponse.findFirst({
where: { cacheKey },
@@ -79,23 +68,28 @@ export const externalApiRouter = createTRPCRouter({
await prisma.loggedCall.create({
data: {
projectId: key.projectId,
projectId: ctx.key.projectId,
requestedAt: new Date(input.requestedAt),
cacheHit: true,
modelResponseId: existingResponse.id,
},
});
await createTags(
existingResponse.originalLoggedCall.projectId,
existingResponse.originalLoggedCallId,
input.tags,
);
return {
respPayload: existingResponse.respPayload,
};
}),
report: publicProcedure
report: openApiProtectedProc
.meta({
openapi: {
method: "POST",
path: "/v1/report",
path: "/report",
description: "Report an API call",
protect: true,
},
@@ -113,45 +107,38 @@ export const externalApiRouter = createTRPCRouter({
.optional()
.describe(
'Extra tags to attach to the call for filtering. Eg { "userId": "123", "promptId": "populate-title" }',
),
)
.default({}),
}),
)
.output(z.void())
.output(z.object({ status: z.literal("ok") }))
.mutation(async ({ input, ctx }) => {
console.log("GOT TAGS", input.tags);
const apiKey = ctx.apiKey;
if (!apiKey) {
throw new TRPCError({ code: "UNAUTHORIZED" });
}
const key = await prisma.apiKey.findUnique({
where: { apiKey },
});
if (!key) {
throw new TRPCError({ code: "UNAUTHORIZED" });
}
const reqPayload = await reqValidator.spa(input.reqPayload);
const respPayload = await respValidator.spa(input.respPayload);
const requestHash = hashRequest(key.projectId, reqPayload as JsonValue);
const requestHash = hashRequest(ctx.key.projectId, reqPayload as JsonValue);
const newLoggedCallId = uuidv4();
const newModelResponseId = uuidv4();
let usage;
let model;
if (reqPayload.success && respPayload.success) {
usage = modelProvider.getUsage(
input.reqPayload as CompletionCreateParams,
input.respPayload as ChatCompletion,
);
model = reqPayload.data.model;
}
await prisma.$transaction([
prisma.loggedCall.create({
data: {
id: newLoggedCallId,
projectId: key.projectId,
projectId: ctx.key.projectId,
requestedAt: new Date(input.requestedAt),
cacheHit: false,
model,
},
}),
prisma.loggedCallModelResponse.create({
@@ -182,22 +169,78 @@ export const externalApiRouter = createTRPCRouter({
}),
]);
const tagsToCreate = Object.entries(input.tags ?? {}).map(([name, value]) => ({
loggedCallId: newLoggedCallId,
// sanitize tags
name: name.replaceAll(/[^a-zA-Z0-9_]/g, "_"),
value,
}));
if (reqPayload.success) {
tagsToCreate.push({
loggedCallId: newLoggedCallId,
name: "$model",
value: reqPayload.data.model,
});
await createTags(ctx.key.projectId, newLoggedCallId, input.tags);
return { status: "ok" };
}),
localTestingOnlyGetLatestLoggedCall: openApiProtectedProc
.meta({
openapi: {
method: "GET",
path: "/local-testing-only-get-latest-logged-call",
description: "Get the latest logged call (only for local testing)",
protect: true, // Make sure to protect this endpoint
},
})
.input(z.void())
.output(
z
.object({
createdAt: z.date(),
cacheHit: z.boolean(),
tags: z.record(z.string().nullable()),
modelResponse: z
.object({
id: z.string(),
statusCode: z.number().nullable(),
errorMessage: z.string().nullable(),
reqPayload: z.unknown(),
respPayload: z.unknown(),
})
.nullable(),
})
.nullable(),
)
.mutation(async ({ ctx }) => {
if (process.env.NODE_ENV === "production") {
throw new Error("This operation is not allowed in production environment");
}
await prisma.loggedCallTag.createMany({
data: tagsToCreate,
const latestLoggedCall = await prisma.loggedCall.findFirst({
where: { projectId: ctx.key.projectId },
orderBy: { requestedAt: "desc" },
select: {
createdAt: true,
cacheHit: true,
tags: true,
modelResponse: {
select: {
id: true,
statusCode: true,
errorMessage: true,
reqPayload: true,
respPayload: true,
},
},
},
});
return (
latestLoggedCall && {
...latestLoggedCall,
tags: Object.fromEntries(latestLoggedCall.tags.map((tag) => [tag.name, tag.value])),
}
);
}),
});
async function createTags(projectId: string, loggedCallId: string, tags: Record<string, string>) {
const tagsToCreate = Object.entries(tags).map(([name, value]) => ({
projectId,
loggedCallId,
name: name.replaceAll(/[^a-zA-Z0-9_$]/g, "_"),
value,
}));
await prisma.loggedCallTag.createMany({
data: tagsToCreate,
});
}
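As a quick sanity check on the tag handling above, here is the name sanitization from `createTags` recreated standalone: anything outside `[a-zA-Z0-9_$]` becomes an underscore, while values are stored untouched.

```ts
// Standalone recreation of the createTags name-sanitization rule.
const sanitizeTagName = (name: string) => name.replaceAll(/[^a-zA-Z0-9_$]/g, "_");

console.log(sanitizeTagName("prompt-id"));  // "prompt_id"
console.log(sanitizeTagName("user.email")); // "user_email"
console.log(sanitizeTagName("$model"));     // "$model" ("$" is allowed through)
```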

View File

@@ -8,9 +8,9 @@ import { evaluationsRouter } from "./routers/evaluations.router";
import { worldChampsRouter } from "./routers/worldChamps.router";
import { datasetsRouter } from "./routers/datasets.router";
import { datasetEntries } from "./routers/datasetEntries.router";
import { externalApiRouter } from "./routers/externalApi.router";
import { projectsRouter } from "./routers/projects.router";
import { dashboardRouter } from "./routers/dashboard.router";
import { loggedCallsRouter } from "./routers/loggedCalls.router";
/**
* This is the primary router for your server.
@@ -29,7 +29,7 @@ export const appRouter = createTRPCRouter({
datasetEntries: datasetEntries,
projects: projectsRouter,
dashboard: dashboardRouter,
externalApi: externalApiRouter,
loggedCalls: loggedCallsRouter,
});
// export type definition of API

View File

@@ -1,11 +1,12 @@
import { sql } from "kysely";
import { z } from "zod";
import { createTRPCRouter, publicProcedure } from "~/server/api/trpc";
import { kysely, prisma } from "~/server/db";
import { createTRPCRouter, protectedProcedure } from "~/server/api/trpc";
import { kysely } from "~/server/db";
import { requireCanViewProject } from "~/utils/accessControl";
import dayjs from "~/utils/dayjs";
export const dashboardRouter = createTRPCRouter({
stats: publicProcedure
stats: protectedProcedure
.input(
z.object({
// TODO: actually take startDate into account
@@ -13,7 +14,8 @@ export const dashboardRouter = createTRPCRouter({
projectId: z.string(),
}),
)
.query(async ({ input }) => {
.query(async ({ input, ctx }) => {
await requireCanViewProject(input.projectId, ctx);
// Return the stats group by hour
const periods = await kysely
.selectFrom("LoggedCall")
@@ -103,16 +105,4 @@ export const dashboardRouter = createTRPCRouter({
return { periods: backfilledPeriods, totals, errors: namedErrors };
}),
// TODO useInfiniteQuery
// https://discord.com/channels/966627436387266600/1122258443886153758/1122258443886153758
loggedCalls: publicProcedure.input(z.object({})).query(async ({ input }) => {
const loggedCalls = await prisma.loggedCall.findMany({
orderBy: { requestedAt: "desc" },
include: { tags: true, modelResponse: true },
take: 20,
});
return loggedCalls;
}),
});

View File

@@ -4,23 +4,21 @@ import { prisma } from "~/server/db";
import { requireCanModifyDataset, requireCanViewDataset } from "~/utils/accessControl";
import { autogenerateDatasetEntries } from "../autogenerate/autogenerateDatasetEntries";
const PAGE_SIZE = 10;
export const datasetEntries = createTRPCRouter({
list: protectedProcedure
.input(z.object({ datasetId: z.string(), page: z.number() }))
.input(z.object({ datasetId: z.string(), page: z.number(), pageSize: z.number() }))
.query(async ({ input, ctx }) => {
await requireCanViewDataset(input.datasetId, ctx);
const { datasetId, page } = input;
const { datasetId, page, pageSize } = input;
const entries = await prisma.datasetEntry.findMany({
where: {
datasetId,
},
orderBy: { createdAt: "desc" },
skip: (page - 1) * PAGE_SIZE,
take: PAGE_SIZE,
skip: (page - 1) * pageSize,
take: pageSize,
});
const count = await prisma.datasetEntry.count({
@@ -31,8 +29,6 @@ export const datasetEntries = createTRPCRouter({
return {
entries,
startIndex: (page - 1) * PAGE_SIZE + 1,
lastPage: Math.ceil(count / PAGE_SIZE),
count,
};
}),

View File

@@ -0,0 +1,197 @@
import { z } from "zod";
import { type Expression, type SqlBool, sql } from "kysely";
import { jsonArrayFrom } from "kysely/helpers/postgres";
import { createTRPCRouter, protectedProcedure } from "~/server/api/trpc";
import { kysely, prisma } from "~/server/db";
import { comparators, defaultFilterableFields } from "~/state/logFiltersSlice";
import { requireCanViewProject } from "~/utils/accessControl";
// map a comparator and user-supplied value to the SQL fragment used in the raw where clauses below
const comparatorToSqlValue = (comparator: (typeof comparators)[number], value: string) => {
switch (comparator) {
case "=":
return `= '${value}'`;
case "!=":
return `!= '${value}'`;
case "CONTAINS":
return `like '%${value}%'`;
}
};
export const loggedCallsRouter = createTRPCRouter({
list: protectedProcedure
.input(
z.object({
projectId: z.string(),
page: z.number(),
pageSize: z.number(),
filters: z.array(
z.object({
field: z.string(),
comparator: z.enum(comparators),
value: z.string().optional(),
}),
),
}),
)
.query(async ({ input, ctx }) => {
const { projectId, page, pageSize } = input;
await requireCanViewProject(projectId, ctx);
const baseQuery = kysely
.selectFrom("LoggedCall as lc")
.leftJoin("LoggedCallModelResponse as lcmr", "lc.id", "lcmr.originalLoggedCallId")
.where((eb) => {
const wheres: Expression<SqlBool>[] = [eb("lc.projectId", "=", projectId)];
for (const filter of input.filters) {
if (!filter.value) continue;
if (filter.field === "Request") {
wheres.push(
sql.raw(
`lcmr."reqPayload"::text ${comparatorToSqlValue(
filter.comparator,
filter.value,
)}`,
),
);
}
if (filter.field === "Response") {
wheres.push(
sql.raw(
`lcmr."respPayload"::text ${comparatorToSqlValue(
filter.comparator,
filter.value,
)}`,
),
);
}
if (filter.field === "Model") {
wheres.push(
sql.raw(`lc."model" ${comparatorToSqlValue(filter.comparator, filter.value)}`),
);
}
if (filter.field === "Status Code") {
wheres.push(
sql.raw(
`lcmr."statusCode"::text ${comparatorToSqlValue(
filter.comparator,
filter.value,
)}`,
),
);
}
}
return eb.and(wheres);
});
const tagFilters = input.filters.filter(
(filter) =>
!defaultFilterableFields.includes(
filter.field as (typeof defaultFilterableFields)[number],
),
);
let updatedBaseQuery = baseQuery;
for (let i = 0; i < tagFilters.length; i++) {
const filter = tagFilters[i];
if (!filter?.value) continue;
const tableAlias = `lct${i}`;
updatedBaseQuery = updatedBaseQuery
.leftJoin(`LoggedCallTag as ${tableAlias}`, (join) =>
join
.onRef("lc.id", "=", `${tableAlias}.loggedCallId`)
.on(`${tableAlias}.name`, "=", filter.field),
)
.where(
sql.raw(`${tableAlias}.value ${comparatorToSqlValue(filter.comparator, filter.value)}`),
) as unknown as typeof baseQuery;
}
const rawCalls = await updatedBaseQuery
.select((eb) => [
"lc.id as id",
"lc.requestedAt as requestedAt",
"model",
"cacheHit",
"lc.requestedAt",
"receivedAt",
"reqPayload",
"respPayload",
"model",
"inputTokens",
"outputTokens",
"cost",
"statusCode",
"durationMs",
jsonArrayFrom(
eb
.selectFrom("LoggedCallTag")
.select(["name", "value"])
.whereRef("loggedCallId", "=", "lc.id"),
).as("tags"),
])
.orderBy("lc.requestedAt", "desc")
.limit(pageSize)
.offset((page - 1) * pageSize)
.execute();
const calls = rawCalls.map((rawCall) => {
const tagsObject = rawCall.tags.reduce(
(acc, tag) => {
acc[tag.name] = tag.value;
return acc;
},
{} as Record<string, string | null>,
);
return {
id: rawCall.id,
requestedAt: rawCall.requestedAt,
model: rawCall.model,
cacheHit: rawCall.cacheHit,
modelResponse: {
receivedAt: rawCall.receivedAt,
reqPayload: rawCall.reqPayload,
respPayload: rawCall.respPayload,
inputTokens: rawCall.inputTokens,
outputTokens: rawCall.outputTokens,
cost: rawCall.cost,
statusCode: rawCall.statusCode,
durationMs: rawCall.durationMs,
},
tags: tagsObject,
};
});
const matchingLogIds = await updatedBaseQuery.select(["lc.id"]).execute();
const count = matchingLogIds.length;
return { calls, count, matchingLogIds: matchingLogIds.map((log) => log.id) };
}),
getTagNames: protectedProcedure
.input(z.object({ projectId: z.string() }))
.query(async ({ input, ctx }) => {
await requireCanViewProject(input.projectId, ctx);
const tags = await prisma.loggedCallTag.findMany({
distinct: ["name"],
where: {
projectId: input.projectId,
},
select: {
name: true,
},
orderBy: {
name: "asc",
},
});
return tags.map((tag) => tag.name);
}),
});
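To make the filter translation concrete, here is a standalone recreation of `comparatorToSqlValue` together with the fragments it produces for two hypothetical filters. Note that the value is interpolated directly into the raw SQL, so it is assumed to come from trusted, already-validated input.

```ts
// Standalone recreation of the helper above, plus example fragments.
type Comparator = "=" | "!=" | "CONTAINS";

const comparatorToSqlValue = (comparator: Comparator, value: string) => {
  switch (comparator) {
    case "=":
      return `= '${value}'`;
    case "!=":
      return `!= '${value}'`;
    case "CONTAINS":
      return `like '%${value}%'`;
  }
};

// { field: "Model", comparator: "=", value: "gpt-4" }
console.log(`lc."model" ${comparatorToSqlValue("=", "gpt-4")}`);
// -> lc."model" = 'gpt-4'

// { field: "Response", comparator: "CONTAINS", value: "refund" }
console.log(`lcmr."respPayload"::text ${comparatorToSqlValue("CONTAINS", "refund")}`);
// -> lcmr."respPayload"::text like '%refund%'
```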

View File

@@ -131,6 +131,8 @@ export const promptVariantsRouter = createTRPCRouter({
const inputTokens = overallTokens._sum?.inputTokens ?? 0;
const outputTokens = overallTokens._sum?.outputTokens ?? 0;
const awaitingCompletions = outputCount < scenarioCount;
const awaitingEvals = !!evalResults.find(
(result) => result.totalCount < scenarioCount * evals.length,
);
@@ -142,6 +144,7 @@ export const promptVariantsRouter = createTRPCRouter({
overallCost: overallTokens._sum?.cost ?? 0,
scenarioCount,
outputCount,
awaitingCompletions,
awaitingEvals,
};
}),

View File

@@ -7,15 +7,13 @@ import { runAllEvals } from "~/server/utils/evaluations";
import { generateNewCell } from "~/server/utils/generateNewCell";
import { requireCanModifyExperiment, requireCanViewExperiment } from "~/utils/accessControl";
const PAGE_SIZE = 10;
export const scenariosRouter = createTRPCRouter({
list: publicProcedure
.input(z.object({ experimentId: z.string(), page: z.number() }))
.input(z.object({ experimentId: z.string(), page: z.number(), pageSize: z.number() }))
.query(async ({ input, ctx }) => {
await requireCanViewExperiment(input.experimentId, ctx);
const { experimentId, page } = input;
const { experimentId, page, pageSize } = input;
const scenarios = await prisma.testScenario.findMany({
where: {
@@ -23,8 +21,8 @@ export const scenariosRouter = createTRPCRouter({
visible: true,
},
orderBy: { sortIndex: "asc" },
skip: (page - 1) * PAGE_SIZE,
take: PAGE_SIZE,
skip: (page - 1) * pageSize,
take: pageSize,
});
const count = await prisma.testScenario.count({
@@ -36,8 +34,6 @@ export const scenariosRouter = createTRPCRouter({
return {
scenarios,
startIndex: (page - 1) * PAGE_SIZE + 1,
lastPage: Math.ceil(count / PAGE_SIZE),
count,
};
}),

View File

@@ -27,7 +27,6 @@ import { capturePath } from "~/utils/analytics/serverAnalytics";
type CreateContextOptions = {
session: Session | null;
apiKey: string | null;
};
// eslint-disable-next-line @typescript-eslint/no-empty-function
@@ -46,7 +45,6 @@ const noOp = () => {};
export const createInnerTRPCContext = (opts: CreateContextOptions) => {
return {
session: opts.session,
apiKey: opts.apiKey,
prisma,
markAccessControlRun: noOp,
};
@@ -64,11 +62,8 @@ export const createTRPCContext = async (opts: CreateNextContextOptions) => {
// Get the session from the server using the getServerSession wrapper function
const session = await getServerAuthSession({ req, res });
const apiKey = req.headers.authorization?.split(" ")[1] as string | null;
return createInnerTRPCContext({
session,
apiKey,
});
};

View File

@@ -1,32 +1,41 @@
import "dotenv/config";
import { openApiDocument } from "~/pages/api/openapi.json";
import { openApiDocument } from "~/pages/api/v1/openapi.json";
import fs from "fs";
import path from "path";
import { execSync } from "child_process";
console.log("Exporting public OpenAPI schema to client-libs/schema.json");
import { generate } from "openapi-typescript-codegen";
const scriptPath = import.meta.url.replace("file://", "");
const clientLibsPath = path.join(path.dirname(scriptPath), "../../../../client-libs");
const schemaPath = path.join(clientLibsPath, "schema.json");
const schemaPath = path.join(clientLibsPath, "openapi.json");
console.log(`Exporting public OpenAPI schema to ${schemaPath}`);
console.log("Exporting schema");
fs.writeFileSync(schemaPath, JSON.stringify(openApiDocument, null, 2), "utf-8");
console.log("Generating Typescript client");
console.log("Generating TypeScript client");
const tsClientPath = path.join(clientLibsPath, "typescript/codegen");
const tsClientPath = path.join(clientLibsPath, "typescript/src/codegen");
fs.rmSync(tsClientPath, { recursive: true, force: true });
fs.mkdirSync(tsClientPath, { recursive: true });
execSync(
`pnpm dlx @openapitools/openapi-generator-cli generate -i "${schemaPath}" -g typescript-axios -o "${tsClientPath}"`,
{
stdio: "inherit",
},
);
await generate({
input: openApiDocument,
output: tsClientPath,
clientName: "OPClient",
httpClient: "node",
});
// execSync(
// `pnpm run openapi generate --input "${schemaPath}" --output "${tsClientPath}" --name OPClient --client node`,
// {
// stdio: "inherit",
// },
// );
console.log("Generating Python client");
execSync(path.join(clientLibsPath, "python/codegen.sh"));
console.log("Done!");
process.exit(0);

View File

@@ -51,7 +51,7 @@ const requestUpdatedPromptFunction = async (
originalModelProvider.inputSchema,
null,
2,
)}\n\nDo not add any assistant messages.`,
)}`,
},
{
role: "user",

View File

@@ -2,4 +2,4 @@ import cryptoRandomString from "crypto-random-string";
const KEY_LENGTH = 42;
export const generateApiKey = () => `opc_${cryptoRandomString({ length: KEY_LENGTH })}`;
export const generateApiKey = () => `opk_${cryptoRandomString({ length: KEY_LENGTH })}`;

View File

@@ -1,13 +1,29 @@
import fs from "fs";
import path from "path";
import OpenAI, { type ClientOptions } from "openpipe/src/openai";
import { env } from "~/env.mjs";
import { default as OriginalOpenAI } from "openai";
// import { OpenAI } from "openpipe";
let config: ClientOptions;
const openAIConfig = { apiKey: env.OPENAI_API_KEY ?? "dummy-key" };
try {
// Allow developers to override the config with a local file
const jsonData = fs.readFileSync(
path.join(path.dirname(import.meta.url).replace("file://", ""), "./openaiCustomConfig.json"),
"utf8",
);
config = JSON.parse(jsonData.toString());
} catch (error) {
// Set a dummy key so it doesn't fail at build time
config = {
apiKey: env.OPENAI_API_KEY ?? "dummy-key",
openpipe: {
apiKey: env.OPENPIPE_API_KEY,
baseUrl: "http://localhost:3000/api/v1",
},
};
}
// Set a dummy key so it doesn't fail at build time
// export const openai = env.OPENPIPE_API_KEY
// ? new OpenAI.OpenAI(openAIConfig)
// : new OriginalOpenAI(openAIConfig);
// export const openai = env.OPENPIPE_API_KEY ? new OpenAI.OpenAI(config) : new OriginalOpenAI(config);
export const openai = new OriginalOpenAI(openAIConfig);
export const openai = new OpenAI(config);

View File

@@ -0,0 +1,39 @@
import { type SliceCreator } from "./store";
export const comparators = ["=", "!=", "CONTAINS"] as const;
export const defaultFilterableFields = ["Request", "Response", "Model", "Status Code"] as const;
export interface LogFilter {
field: string;
comparator: (typeof comparators)[number];
value?: string;
}
export type LogFiltersSlice = {
filters: LogFilter[];
addFilter: (filter: LogFilter) => void;
updateFilter: (index: number, filter: LogFilter) => void;
deleteFilter: (index: number) => void;
clearSelectedLogIds: () => void;
};
export const createLogFiltersSlice: SliceCreator<LogFiltersSlice> = (set, get) => ({
filters: [],
addFilter: (filter: LogFilter) =>
set((state) => {
state.logFilters.filters.push(filter);
}),
updateFilter: (index: number, filter: LogFilter) =>
set((state) => {
state.logFilters.filters[index] = filter;
}),
deleteFilter: (index: number) =>
set((state) => {
state.logFilters.filters.splice(index, 1);
}),
clearSelectedLogIds: () =>
set((state) => {
state.logFilters.filters = [];
}),
});
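A minimal sketch of how this slice is driven from application code, mirroring the dropdown component earlier in this diff; the filter values here are hypothetical.

```ts
import { useAppStore } from "~/state/store";
import { defaultFilterableFields, type LogFilter } from "~/state/logFiltersSlice";

// Outside React, the store can be driven through getState(), as the
// experiment page already does for the variant editor.
const seedFilters = () => {
  const { addFilter, updateFilter } = useAppStore.getState().logFilters;

  // Start with a blank "Request CONTAINS ..." filter...
  const initial: LogFilter = {
    field: defaultFilterableFields[0],
    comparator: "CONTAINS",
    value: "",
  };
  addFilter(initial);

  // ...then narrow it to an exact model match (hypothetical value).
  updateFilter(0, { field: "Model", comparator: "=", value: "gpt-4" });
};

seedFilters();
```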

View File

@@ -0,0 +1,28 @@
import { type SliceCreator } from "./store";
export type SelectedLogsSlice = {
selectedLogIds: Set<string>;
toggleSelectedLogId: (id: string) => void;
addSelectedLogIds: (ids: string[]) => void;
clearSelectedLogIds: () => void;
};
export const createSelectedLogsSlice: SliceCreator<SelectedLogsSlice> = (set, get) => ({
selectedLogIds: new Set(),
toggleSelectedLogId: (id: string) =>
set((state) => {
if (state.selectedLogs.selectedLogIds.has(id)) {
state.selectedLogs.selectedLogIds.delete(id);
} else {
state.selectedLogs.selectedLogIds.add(id);
}
}),
addSelectedLogIds: (ids: string[]) =>
set((state) => {
state.selectedLogs.selectedLogIds = new Set([...state.selectedLogs.selectedLogIds, ...ids]);
}),
clearSelectedLogIds: () =>
set((state) => {
state.selectedLogs.selectedLogIds = new Set();
}),
});

View File

@@ -81,8 +81,6 @@ export const createVariantEditorSlice: SliceCreator<SharedVariantEditorSlice> =
};
`;
console.log(modelContents);
const scenariosModel = monaco.editor.getModel(monaco.Uri.parse("file:///scenarios.ts"));
if (scenariosModel) {

View File

@@ -1,5 +1,6 @@
import { type StateCreator, create } from "zustand";
import { immer } from "zustand/middleware/immer";
import { enableMapSet } from "immer";
import { persist } from "zustand/middleware";
import { createSelectors } from "./createSelectors";
import {
@@ -8,6 +9,10 @@ import {
} from "./sharedVariantEditor.slice";
import { type APIClient } from "~/utils/api";
import { persistOptions, type stateToPersist } from "./persist";
import { type SelectedLogsSlice, createSelectedLogsSlice } from "./selectedLogsSlice";
import { type LogFiltersSlice, createLogFiltersSlice } from "./logFiltersSlice";
enableMapSet();
export type State = {
drawerOpen: boolean;
@@ -17,7 +22,9 @@ export type State = {
setApi: (api: APIClient) => void;
sharedVariantEditor: SharedVariantEditorSlice;
selectedProjectId: string | null;
setselectedProjectId: (id: string) => void;
setSelectedProjectId: (id: string) => void;
selectedLogs: SelectedLogsSlice;
logFilters: LogFiltersSlice;
};
export type SliceCreator<T> = StateCreator<State, [["zustand/immer", never]], [], T>;
@@ -48,10 +55,12 @@ const useBaseStore = create<
}),
sharedVariantEditor: createVariantEditorSlice(set, get, ...rest),
selectedProjectId: null,
setselectedProjectId: (id: string) =>
setSelectedProjectId: (id: string) =>
set((state) => {
state.selectedProjectId = id;
}),
selectedLogs: createSelectedLogsSlice(set, get, ...rest),
logFilters: createLogFiltersSlice(set, get, ...rest),
})),
persistOptions,
),
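For reference, `enableMapSet()` is what allows the immer middleware to mutate the `Set` held by `selectedLogs`; a minimal standalone sketch of the behaviour it unlocks, using plain immer outside the store:

```ts
import { enableMapSet, produce } from "immer";

// Without this call, produce() throws when a draft's Set/Map is mutated.
enableMapSet();

const state = { selectedLogIds: new Set<string>(["a"]) };

const next = produce(state, (draft) => {
  // Mutating the Set on the draft mirrors how selectedLogsSlice toggles ids.
  draft.selectedLogIds.add("b");
});

console.log([...next.selectedLogIds]);  // ["a", "b"]
console.log([...state.selectedLogIds]); // ["a"] – the original is untouched
```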

View File

@@ -45,7 +45,7 @@ const theme = extendTheme({
components: {
Button: {
baseStyle: {
borderRadius: "sm",
borderRadius: "md",
},
},
Input: {

View File

@@ -1,31 +0,0 @@
import { type Session } from "next-auth";
import { useSession } from "next-auth/react";
import { useEffect } from "react";
import posthog from "posthog-js";
import { env } from "~/env.mjs";
// Make sure we're in the browser
const enableBrowserAnalytics = typeof window !== "undefined";
if (env.NEXT_PUBLIC_POSTHOG_KEY && enableBrowserAnalytics) {
posthog.init(env.NEXT_PUBLIC_POSTHOG_KEY, {
api_host: `${env.NEXT_PUBLIC_HOST}/ingest`,
});
}
export const identifySession = (session: Session) => {
if (!session.user) return;
posthog.identify(session.user.id, {
name: session.user.name,
email: session.user.email,
});
};
export const SessionIdentifier = () => {
const session = useSession().data;
useEffect(() => {
if (session && enableBrowserAnalytics) identifySession(session);
}, [session]);
return null;
};

View File

@@ -0,0 +1,41 @@
"use client";
import { useSession } from "next-auth/react";
import React, { type ReactNode, useEffect } from "react";
import { PostHogProvider } from "posthog-js/react";
import posthog from "posthog-js";
import { env } from "~/env.mjs";
import { useRouter } from "next/router";
// Make sure we're in the browser
const inBrowser = typeof window !== "undefined";
export const PosthogAppProvider = ({ children }: { children: ReactNode }) => {
const session = useSession().data;
const router = useRouter();
useEffect(() => {
// Track page views
const handleRouteChange = () => posthog?.capture("$pageview");
router.events.on("routeChangeComplete", handleRouteChange);
return () => {
router.events.off("routeChangeComplete", handleRouteChange);
};
}, [router.events]);
useEffect(() => {
if (env.NEXT_PUBLIC_POSTHOG_KEY && inBrowser && session && session.user) {
posthog.init(env.NEXT_PUBLIC_POSTHOG_KEY, {
api_host: `${env.NEXT_PUBLIC_HOST}/ingest`,
});
posthog.identify(session.user.id, {
name: session.user.name,
email: session.user.email,
});
}
}, [session]);
return <PostHogProvider client={posthog}>{children}</PostHogProvider>;
};

View File

@@ -1,7 +1,7 @@
import { useRouter } from "next/router";
import { type RefObject, useCallback, useEffect, useRef, useState } from "react";
import { api } from "~/utils/api";
import { NumberParam, useQueryParam, withDefault } from "use-query-params";
import { NumberParam, useQueryParams } from "use-query-params";
import { useAppStore } from "~/state/store";
export const useExperiments = () => {
@@ -46,10 +46,10 @@ export const useDataset = () => {
export const useDatasetEntries = () => {
const dataset = useDataset();
const [page] = usePage();
const { page, pageSize } = usePageParams();
return api.datasetEntries.list.useQuery(
{ datasetId: dataset.data?.id ?? "", page },
{ datasetId: dataset.data?.id ?? "", page, pageSize },
{ enabled: dataset.data?.id != null },
);
};
@@ -132,14 +132,23 @@ export const useElementDimensions = (): [RefObject<HTMLElement>, Dimensions | un
return [ref, dimensions];
};
export const usePage = () => useQueryParam("page", withDefault(NumberParam, 1));
export const usePageParams = () => {
const [pageParams, setPageParams] = useQueryParams({
page: NumberParam,
pageSize: NumberParam,
});
const { page, pageSize } = pageParams;
return { page: page || 1, pageSize: pageSize || 10, setPageParams };
};
export const useScenarios = () => {
const experiment = useExperiment();
const [page] = usePage();
const { page, pageSize } = usePageParams();
return api.scenarios.list.useQuery(
{ experimentId: experiment.data?.id ?? "", page },
{ experimentId: experiment.data?.id ?? "", page, pageSize },
{ enabled: experiment.data?.id != null },
);
};
@@ -166,3 +175,33 @@ export const useScenarioVars = () => {
{ enabled: experiment.data?.id != null },
);
};
export const useLoggedCalls = () => {
const selectedProjectId = useAppStore((state) => state.selectedProjectId);
const { page, pageSize } = usePageParams();
const filters = useAppStore((state) => state.logFilters.filters);
const { data, isLoading, ...rest } = api.loggedCalls.list.useQuery(
{ projectId: selectedProjectId ?? "", page, pageSize, filters },
{ enabled: !!selectedProjectId },
);
const [stableData, setStableData] = useState(data);
useEffect(() => {
// Prevent annoying flashes while logs are loading from the server
if (!isLoading) {
setStableData(data);
}
}, [data, isLoading]);
return { data: stableData, isLoading, ...rest };
};
export const useTagNames = () => {
const selectedProjectId = useAppStore((state) => state.selectedProjectId);
return api.loggedCalls.getTagNames.useQuery(
{ projectId: selectedProjectId ?? "" },
{ enabled: !!selectedProjectId },
);
};
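A short sketch of how the new `usePageParams` hook is meant to be consumed from a component; resetting to page 1 here is illustrative rather than taken from this diff.

```tsx
import { usePageParams } from "~/utils/hooks";

const PaginationExample = () => {
  const { page, pageSize, setPageParams } = usePageParams();

  return (
    <button
      // Jump back to the first page while keeping the current page size;
      // both values live in the URL query string via use-query-params.
      onClick={() => setPageParams({ page: 1, pageSize })}
    >
      Page {page} ({pageSize} rows per page)
    </button>
  );
};

export default PaginationExample;
```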

9
app/test-docker.sh Executable file
View File

@@ -0,0 +1,9 @@
#! /bin/bash
set -e
cd "$(dirname "$0")/.."
source app/.env
docker build . --file app/Dockerfile

View File

@@ -25,11 +25,11 @@
".eslintrc.cjs",
"next-env.d.ts",
"**/*.ts",
"**/*.mts",
"**/*.tsx",
"**/*.cjs",
"**/*.mjs",
"**/*.js",
"src/pages/api/sentry-example-api.js"
"**/*.js"
],
"exclude": ["node_modules"]
}

289
client-libs/openapi.json Normal file
View File

@@ -0,0 +1,289 @@
{
"openapi": "3.0.3",
"info": {
"title": "OpenPipe API",
"description": "The public API for reporting API calls to OpenPipe",
"version": "0.1.1"
},
"servers": [
{
"url": "https://app.openpipe.ai/api/v1"
}
],
"paths": {
"/check-cache": {
"post": {
"operationId": "checkCache",
"description": "Check if a prompt is cached",
"security": [
{
"Authorization": []
}
],
"requestBody": {
"required": true,
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"requestedAt": {
"type": "number",
"description": "Unix timestamp in milliseconds"
},
"reqPayload": {
"description": "JSON-encoded request payload"
},
"tags": {
"type": "object",
"additionalProperties": {
"type": "string"
},
"description": "Extra tags to attach to the call for filtering. Eg { \"userId\": \"123\", \"promptId\": \"populate-title\" }",
"default": {}
}
},
"required": [
"requestedAt"
],
"additionalProperties": false
}
}
}
},
"parameters": [],
"responses": {
"200": {
"description": "Successful response",
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"respPayload": {
"description": "JSON-encoded response payload"
}
},
"additionalProperties": false
}
}
}
},
"default": {
"$ref": "#/components/responses/error"
}
}
}
},
"/report": {
"post": {
"operationId": "report",
"description": "Report an API call",
"security": [
{
"Authorization": []
}
],
"requestBody": {
"required": true,
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"requestedAt": {
"type": "number",
"description": "Unix timestamp in milliseconds"
},
"receivedAt": {
"type": "number",
"description": "Unix timestamp in milliseconds"
},
"reqPayload": {
"description": "JSON-encoded request payload"
},
"respPayload": {
"description": "JSON-encoded response payload"
},
"statusCode": {
"type": "number",
"description": "HTTP status code of response"
},
"errorMessage": {
"type": "string",
"description": "User-friendly error message"
},
"tags": {
"type": "object",
"additionalProperties": {
"type": "string"
},
"description": "Extra tags to attach to the call for filtering. Eg { \"userId\": \"123\", \"promptId\": \"populate-title\" }",
"default": {}
}
},
"required": [
"requestedAt",
"receivedAt"
],
"additionalProperties": false
}
}
}
},
"parameters": [],
"responses": {
"200": {
"description": "Successful response",
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"status": {
"type": "string",
"enum": [
"ok"
]
}
},
"required": [
"status"
],
"additionalProperties": false
}
}
}
},
"default": {
"$ref": "#/components/responses/error"
}
}
}
},
"/local-testing-only-get-latest-logged-call": {
"get": {
"operationId": "localTestingOnlyGetLatestLoggedCall",
"description": "Get the latest logged call (only for local testing)",
"security": [
{
"Authorization": []
}
],
"parameters": [],
"responses": {
"200": {
"description": "Successful response",
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"createdAt": {
"type": "string",
"format": "date-time"
},
"cacheHit": {
"type": "boolean"
},
"tags": {
"type": "object",
"additionalProperties": {
"type": "string",
"nullable": true
}
},
"modelResponse": {
"type": "object",
"properties": {
"id": {
"type": "string"
},
"statusCode": {
"type": "number",
"nullable": true
},
"errorMessage": {
"type": "string",
"nullable": true
},
"reqPayload": {},
"respPayload": {}
},
"required": [
"id",
"statusCode",
"errorMessage"
],
"additionalProperties": false,
"nullable": true
}
},
"required": [
"createdAt",
"cacheHit",
"tags",
"modelResponse"
],
"additionalProperties": false,
"nullable": true
}
}
}
},
"default": {
"$ref": "#/components/responses/error"
}
}
}
}
},
"components": {
"securitySchemes": {
"Authorization": {
"type": "http",
"scheme": "bearer"
}
},
"responses": {
"error": {
"description": "Error response",
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"message": {
"type": "string"
},
"code": {
"type": "string"
},
"issues": {
"type": "array",
"items": {
"type": "object",
"properties": {
"message": {
"type": "string"
}
},
"required": [
"message"
],
"additionalProperties": false
}
}
},
"required": [
"message",
"code"
],
"additionalProperties": false
}
}
}
}
}
}
}
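Putting the schema above to work, here is a hedged sketch of a raw `/report` call; the API key and payloads are placeholders, and in practice the openpipe SDKs elsewhere in this changeset wrap this request for you.

```ts
// Minimal fetch against the /report endpoint described by the schema above.
const reportExampleCall = async () => {
  const res = await fetch("https://app.openpipe.ai/api/v1/report", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENPIPE_API_KEY ?? ""}`,
    },
    body: JSON.stringify({
      requestedAt: Date.now(), // Unix timestamps in milliseconds
      receivedAt: Date.now(),
      reqPayload: { model: "gpt-3.5-turbo", messages: [] }, // placeholder
      respPayload: {},                                      // placeholder
      statusCode: 200,
      tags: { promptId: "populate-title" },
    }),
  });

  const body = (await res.json()) as { status: "ok" };
  console.log(body.status); // "ok"
};
```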

View File

@@ -4,7 +4,7 @@ set -e
cd "$(dirname "$0")"
poetry run openapi-python-client generate --url http://localhost:3000/api/openapi.json
poetry run openapi-python-client generate --path ../openapi.json
rm -rf openpipe/api_client
mv open-pipe-api-client/open_pipe_api_client openpipe/api_client

View File

@@ -5,14 +5,14 @@ import httpx
from ... import errors
from ...client import AuthenticatedClient, Client
from ...models.external_api_check_cache_json_body import ExternalApiCheckCacheJsonBody
from ...models.external_api_check_cache_response_200 import ExternalApiCheckCacheResponse200
from ...models.check_cache_json_body import CheckCacheJsonBody
from ...models.check_cache_response_200 import CheckCacheResponse200
from ...types import Response
def _get_kwargs(
*,
json_body: ExternalApiCheckCacheJsonBody,
json_body: CheckCacheJsonBody,
) -> Dict[str, Any]:
pass
@@ -20,16 +20,16 @@ def _get_kwargs(
return {
"method": "post",
"url": "/v1/check-cache",
"url": "/check-cache",
"json": json_json_body,
}
def _parse_response(
*, client: Union[AuthenticatedClient, Client], response: httpx.Response
) -> Optional[ExternalApiCheckCacheResponse200]:
) -> Optional[CheckCacheResponse200]:
if response.status_code == HTTPStatus.OK:
response_200 = ExternalApiCheckCacheResponse200.from_dict(response.json())
response_200 = CheckCacheResponse200.from_dict(response.json())
return response_200
if client.raise_on_unexpected_status:
@@ -40,7 +40,7 @@ def _parse_response(
def _build_response(
*, client: Union[AuthenticatedClient, Client], response: httpx.Response
) -> Response[ExternalApiCheckCacheResponse200]:
) -> Response[CheckCacheResponse200]:
return Response(
status_code=HTTPStatus(response.status_code),
content=response.content,
@@ -52,19 +52,19 @@ def _build_response(
def sync_detailed(
*,
client: AuthenticatedClient,
json_body: ExternalApiCheckCacheJsonBody,
) -> Response[ExternalApiCheckCacheResponse200]:
json_body: CheckCacheJsonBody,
) -> Response[CheckCacheResponse200]:
"""Check if a prompt is cached
Args:
json_body (ExternalApiCheckCacheJsonBody):
json_body (CheckCacheJsonBody):
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
Response[ExternalApiCheckCacheResponse200]
Response[CheckCacheResponse200]
"""
kwargs = _get_kwargs(
@@ -81,19 +81,19 @@ def sync_detailed(
def sync(
*,
client: AuthenticatedClient,
json_body: ExternalApiCheckCacheJsonBody,
) -> Optional[ExternalApiCheckCacheResponse200]:
json_body: CheckCacheJsonBody,
) -> Optional[CheckCacheResponse200]:
"""Check if a prompt is cached
Args:
json_body (ExternalApiCheckCacheJsonBody):
json_body (CheckCacheJsonBody):
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
ExternalApiCheckCacheResponse200
CheckCacheResponse200
"""
return sync_detailed(
@@ -105,19 +105,19 @@ def sync(
async def asyncio_detailed(
*,
client: AuthenticatedClient,
json_body: ExternalApiCheckCacheJsonBody,
) -> Response[ExternalApiCheckCacheResponse200]:
json_body: CheckCacheJsonBody,
) -> Response[CheckCacheResponse200]:
"""Check if a prompt is cached
Args:
json_body (ExternalApiCheckCacheJsonBody):
json_body (CheckCacheJsonBody):
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
Response[ExternalApiCheckCacheResponse200]
Response[CheckCacheResponse200]
"""
kwargs = _get_kwargs(
@@ -132,19 +132,19 @@ async def asyncio_detailed(
async def asyncio(
*,
client: AuthenticatedClient,
json_body: ExternalApiCheckCacheJsonBody,
) -> Optional[ExternalApiCheckCacheResponse200]:
json_body: CheckCacheJsonBody,
) -> Optional[CheckCacheResponse200]:
"""Check if a prompt is cached
Args:
json_body (ExternalApiCheckCacheJsonBody):
json_body (CheckCacheJsonBody):
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
ExternalApiCheckCacheResponse200
CheckCacheResponse200
"""
return (

View File

@@ -1,98 +0,0 @@
from http import HTTPStatus
from typing import Any, Dict, Optional, Union
import httpx
from ... import errors
from ...client import AuthenticatedClient, Client
from ...models.external_api_report_json_body import ExternalApiReportJsonBody
from ...types import Response
def _get_kwargs(
*,
json_body: ExternalApiReportJsonBody,
) -> Dict[str, Any]:
pass
json_json_body = json_body.to_dict()
return {
"method": "post",
"url": "/v1/report",
"json": json_json_body,
}
def _parse_response(*, client: Union[AuthenticatedClient, Client], response: httpx.Response) -> Optional[Any]:
if response.status_code == HTTPStatus.OK:
return None
if client.raise_on_unexpected_status:
raise errors.UnexpectedStatus(response.status_code, response.content)
else:
return None
def _build_response(*, client: Union[AuthenticatedClient, Client], response: httpx.Response) -> Response[Any]:
return Response(
status_code=HTTPStatus(response.status_code),
content=response.content,
headers=response.headers,
parsed=_parse_response(client=client, response=response),
)
def sync_detailed(
*,
client: AuthenticatedClient,
json_body: ExternalApiReportJsonBody,
) -> Response[Any]:
"""Report an API call
Args:
json_body (ExternalApiReportJsonBody):
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
Response[Any]
"""
kwargs = _get_kwargs(
json_body=json_body,
)
response = client.get_httpx_client().request(
**kwargs,
)
return _build_response(client=client, response=response)
async def asyncio_detailed(
*,
client: AuthenticatedClient,
json_body: ExternalApiReportJsonBody,
) -> Response[Any]:
"""Report an API call
Args:
json_body (ExternalApiReportJsonBody):
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
Response[Any]
"""
kwargs = _get_kwargs(
json_body=json_body,
)
response = await client.get_async_httpx_client().request(**kwargs)
return _build_response(client=client, response=response)

View File

@@ -0,0 +1,133 @@
from http import HTTPStatus
from typing import Any, Dict, Optional, Union
import httpx
from ... import errors
from ...client import AuthenticatedClient, Client
from ...models.local_testing_only_get_latest_logged_call_response_200 import (
LocalTestingOnlyGetLatestLoggedCallResponse200,
)
from ...types import Response
def _get_kwargs() -> Dict[str, Any]:
pass
return {
"method": "get",
"url": "/local-testing-only-get-latest-logged-call",
}
def _parse_response(
*, client: Union[AuthenticatedClient, Client], response: httpx.Response
) -> Optional[Optional[LocalTestingOnlyGetLatestLoggedCallResponse200]]:
if response.status_code == HTTPStatus.OK:
_response_200 = response.json()
response_200: Optional[LocalTestingOnlyGetLatestLoggedCallResponse200]
if _response_200 is None:
response_200 = None
else:
response_200 = LocalTestingOnlyGetLatestLoggedCallResponse200.from_dict(_response_200)
return response_200
if client.raise_on_unexpected_status:
raise errors.UnexpectedStatus(response.status_code, response.content)
else:
return None
def _build_response(
*, client: Union[AuthenticatedClient, Client], response: httpx.Response
) -> Response[Optional[LocalTestingOnlyGetLatestLoggedCallResponse200]]:
return Response(
status_code=HTTPStatus(response.status_code),
content=response.content,
headers=response.headers,
parsed=_parse_response(client=client, response=response),
)
def sync_detailed(
*,
client: AuthenticatedClient,
) -> Response[Optional[LocalTestingOnlyGetLatestLoggedCallResponse200]]:
"""Get the latest logged call (only for local testing)
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
Response[Optional[LocalTestingOnlyGetLatestLoggedCallResponse200]]
"""
kwargs = _get_kwargs()
response = client.get_httpx_client().request(
**kwargs,
)
return _build_response(client=client, response=response)
def sync(
*,
client: AuthenticatedClient,
) -> Optional[Optional[LocalTestingOnlyGetLatestLoggedCallResponse200]]:
"""Get the latest logged call (only for local testing)
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
Optional[LocalTestingOnlyGetLatestLoggedCallResponse200]
"""
return sync_detailed(
client=client,
).parsed
async def asyncio_detailed(
*,
client: AuthenticatedClient,
) -> Response[Optional[LocalTestingOnlyGetLatestLoggedCallResponse200]]:
"""Get the latest logged call (only for local testing)
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
Response[Optional[LocalTestingOnlyGetLatestLoggedCallResponse200]]
"""
kwargs = _get_kwargs()
response = await client.get_async_httpx_client().request(**kwargs)
return _build_response(client=client, response=response)
async def asyncio(
*,
client: AuthenticatedClient,
) -> Optional[Optional[LocalTestingOnlyGetLatestLoggedCallResponse200]]:
"""Get the latest logged call (only for local testing)
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
Optional[LocalTestingOnlyGetLatestLoggedCallResponse200]
"""
return (
await asyncio_detailed(
client=client,
)
).parsed
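
For local development, this endpoint can be called directly against a dev server to inspect what the SDK just logged. A minimal sketch, assuming the stock openapi-python-client AuthenticatedClient(base_url=..., token=...) constructor and a server on localhost (both assumptions, not confirmed by this diff):

# Sketch: read back the most recently logged call from a local dev server.
from openpipe.api_client.client import AuthenticatedClient
from openpipe.api_client.api.default import local_testing_only_get_latest_logged_call

client = AuthenticatedClient(base_url="http://localhost:3000/api/v1", token="my-dev-key")  # assumed constructor/base_url
latest = local_testing_only_get_latest_logged_call.sync(client=client)
if latest is not None and latest.model_response is not None:
    # created_at, cache_hit, tags and model_response follow the response model below
    print(latest.created_at, latest.cache_hit, latest.model_response.status_code)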

View File

@@ -0,0 +1,155 @@
from http import HTTPStatus
from typing import Any, Dict, Optional, Union
import httpx
from ... import errors
from ...client import AuthenticatedClient, Client
from ...models.report_json_body import ReportJsonBody
from ...models.report_response_200 import ReportResponse200
from ...types import Response
def _get_kwargs(
*,
json_body: ReportJsonBody,
) -> Dict[str, Any]:
pass
json_json_body = json_body.to_dict()
return {
"method": "post",
"url": "/report",
"json": json_json_body,
}
def _parse_response(
*, client: Union[AuthenticatedClient, Client], response: httpx.Response
) -> Optional[ReportResponse200]:
if response.status_code == HTTPStatus.OK:
response_200 = ReportResponse200.from_dict(response.json())
return response_200
if client.raise_on_unexpected_status:
raise errors.UnexpectedStatus(response.status_code, response.content)
else:
return None
def _build_response(
*, client: Union[AuthenticatedClient, Client], response: httpx.Response
) -> Response[ReportResponse200]:
return Response(
status_code=HTTPStatus(response.status_code),
content=response.content,
headers=response.headers,
parsed=_parse_response(client=client, response=response),
)
def sync_detailed(
*,
client: AuthenticatedClient,
json_body: ReportJsonBody,
) -> Response[ReportResponse200]:
"""Report an API call
Args:
json_body (ReportJsonBody):
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
Response[ReportResponse200]
"""
kwargs = _get_kwargs(
json_body=json_body,
)
response = client.get_httpx_client().request(
**kwargs,
)
return _build_response(client=client, response=response)
def sync(
*,
client: AuthenticatedClient,
json_body: ReportJsonBody,
) -> Optional[ReportResponse200]:
"""Report an API call
Args:
json_body (ReportJsonBody):
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
ReportResponse200
"""
return sync_detailed(
client=client,
json_body=json_body,
).parsed
async def asyncio_detailed(
*,
client: AuthenticatedClient,
json_body: ReportJsonBody,
) -> Response[ReportResponse200]:
"""Report an API call
Args:
json_body (ReportJsonBody):
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
Response[ReportResponse200]
"""
kwargs = _get_kwargs(
json_body=json_body,
)
response = await client.get_async_httpx_client().request(**kwargs)
return _build_response(client=client, response=response)
async def asyncio(
*,
client: AuthenticatedClient,
json_body: ReportJsonBody,
) -> Optional[ReportResponse200]:
"""Report an API call
Args:
json_body (ReportJsonBody):
Raises:
errors.UnexpectedStatus: If the server returns an undocumented status code and Client.raise_on_unexpected_status is True.
httpx.TimeoutException: If the request takes longer than Client.timeout.
Returns:
ReportResponse200
"""
return (
await asyncio_detailed(
client=client,
json_body=json_body,
)
).parsed
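
Taken together with the renamed models further down, the new module reads naturally at the call site. A hedged sketch (client construction and the production base_url are assumptions; field names match ReportJsonBody):

# Sketch: report a call via the renamed `report` endpoint module.
import time
from openpipe.api_client.client import AuthenticatedClient
from openpipe.api_client.api.default import report
from openpipe.api_client.models import ReportJsonBody, ReportJsonBodyTags, ReportResponse200Status

client = AuthenticatedClient(base_url="https://app.openpipe.ai/api/v1", token="my-api-key")  # assumed
now = int(time.time() * 1000)
result = report.sync(
    client=client,
    json_body=ReportJsonBody(
        requested_at=now,
        received_at=now,
        req_payload={"model": "gpt-3.5-turbo", "messages": []},
        resp_payload={"choices": []},
        status_code=200,
        tags=ReportJsonBodyTags.from_dict({"promptId": "populate-title"}),
    ),
)
# On success the parsed response carries status "ok"
assert result is None or result.status == ReportResponse200Status.OK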

View File

@@ -1,15 +1,29 @@
""" Contains all the data models used in inputs/outputs """
from .external_api_check_cache_json_body import ExternalApiCheckCacheJsonBody
from .external_api_check_cache_json_body_tags import ExternalApiCheckCacheJsonBodyTags
from .external_api_check_cache_response_200 import ExternalApiCheckCacheResponse200
from .external_api_report_json_body import ExternalApiReportJsonBody
from .external_api_report_json_body_tags import ExternalApiReportJsonBodyTags
from .check_cache_json_body import CheckCacheJsonBody
from .check_cache_json_body_tags import CheckCacheJsonBodyTags
from .check_cache_response_200 import CheckCacheResponse200
from .local_testing_only_get_latest_logged_call_response_200 import LocalTestingOnlyGetLatestLoggedCallResponse200
from .local_testing_only_get_latest_logged_call_response_200_model_response import (
LocalTestingOnlyGetLatestLoggedCallResponse200ModelResponse,
)
from .local_testing_only_get_latest_logged_call_response_200_tags import (
LocalTestingOnlyGetLatestLoggedCallResponse200Tags,
)
from .report_json_body import ReportJsonBody
from .report_json_body_tags import ReportJsonBodyTags
from .report_response_200 import ReportResponse200
from .report_response_200_status import ReportResponse200Status
__all__ = (
"ExternalApiCheckCacheJsonBody",
"ExternalApiCheckCacheJsonBodyTags",
"ExternalApiCheckCacheResponse200",
"ExternalApiReportJsonBody",
"ExternalApiReportJsonBodyTags",
"CheckCacheJsonBody",
"CheckCacheJsonBodyTags",
"CheckCacheResponse200",
"LocalTestingOnlyGetLatestLoggedCallResponse200",
"LocalTestingOnlyGetLatestLoggedCallResponse200ModelResponse",
"LocalTestingOnlyGetLatestLoggedCallResponse200Tags",
"ReportJsonBody",
"ReportJsonBodyTags",
"ReportResponse200",
"ReportResponse200Status",
)
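
With the rename, client code imports the new names straight from the models package, e.g.:

from openpipe.api_client.models import (
    CheckCacheJsonBody,
    CheckCacheResponse200,
    ReportJsonBody,
    ReportResponse200Status,
)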

View File

@@ -5,25 +5,25 @@ from attrs import define
from ..types import UNSET, Unset
if TYPE_CHECKING:
from ..models.external_api_check_cache_json_body_tags import ExternalApiCheckCacheJsonBodyTags
from ..models.check_cache_json_body_tags import CheckCacheJsonBodyTags
T = TypeVar("T", bound="ExternalApiCheckCacheJsonBody")
T = TypeVar("T", bound="CheckCacheJsonBody")
@define
class ExternalApiCheckCacheJsonBody:
class CheckCacheJsonBody:
"""
Attributes:
requested_at (float): Unix timestamp in milliseconds
req_payload (Union[Unset, Any]): JSON-encoded request payload
tags (Union[Unset, ExternalApiCheckCacheJsonBodyTags]): Extra tags to attach to the call for filtering. Eg {
"userId": "123", "promptId": "populate-title" }
tags (Union[Unset, CheckCacheJsonBodyTags]): Extra tags to attach to the call for filtering. Eg { "userId":
"123", "promptId": "populate-title" }
"""
requested_at: float
req_payload: Union[Unset, Any] = UNSET
tags: Union[Unset, "ExternalApiCheckCacheJsonBodyTags"] = UNSET
tags: Union[Unset, "CheckCacheJsonBodyTags"] = UNSET
def to_dict(self) -> Dict[str, Any]:
requested_at = self.requested_at
@@ -47,7 +47,7 @@ class ExternalApiCheckCacheJsonBody:
@classmethod
def from_dict(cls: Type[T], src_dict: Dict[str, Any]) -> T:
from ..models.external_api_check_cache_json_body_tags import ExternalApiCheckCacheJsonBodyTags
from ..models.check_cache_json_body_tags import CheckCacheJsonBodyTags
d = src_dict.copy()
requested_at = d.pop("requestedAt")
@@ -55,16 +55,16 @@ class ExternalApiCheckCacheJsonBody:
req_payload = d.pop("reqPayload", UNSET)
_tags = d.pop("tags", UNSET)
tags: Union[Unset, ExternalApiCheckCacheJsonBodyTags]
tags: Union[Unset, CheckCacheJsonBodyTags]
if isinstance(_tags, Unset):
tags = UNSET
else:
tags = ExternalApiCheckCacheJsonBodyTags.from_dict(_tags)
tags = CheckCacheJsonBodyTags.from_dict(_tags)
external_api_check_cache_json_body = cls(
check_cache_json_body = cls(
requested_at=requested_at,
req_payload=req_payload,
tags=tags,
)
return external_api_check_cache_json_body
return check_cache_json_body
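
A quick round-trip sketch of the renamed model (the tag keys here are illustrative only):

import time
from openpipe.api_client.models import CheckCacheJsonBody, CheckCacheJsonBodyTags

body = CheckCacheJsonBody(
    requested_at=int(time.time() * 1000),
    req_payload={"model": "gpt-3.5-turbo", "messages": []},
    tags=CheckCacheJsonBodyTags.from_dict({"promptId": "populate-title"}),
)
# to_dict()/from_dict() use the camelCase wire keys (requestedAt, reqPayload, tags)
restored = CheckCacheJsonBody.from_dict(body.to_dict())
assert restored.requested_at == body.requested_at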

View File

@@ -2,11 +2,11 @@ from typing import Any, Dict, List, Type, TypeVar
from attrs import define, field
T = TypeVar("T", bound="ExternalApiReportJsonBodyTags")
T = TypeVar("T", bound="CheckCacheJsonBodyTags")
@define
class ExternalApiReportJsonBodyTags:
class CheckCacheJsonBodyTags:
"""Extra tags to attach to the call for filtering. Eg { "userId": "123", "promptId": "populate-title" }"""
additional_properties: Dict[str, str] = field(init=False, factory=dict)
@@ -21,10 +21,10 @@ class ExternalApiReportJsonBodyTags:
@classmethod
def from_dict(cls: Type[T], src_dict: Dict[str, Any]) -> T:
d = src_dict.copy()
external_api_report_json_body_tags = cls()
check_cache_json_body_tags = cls()
external_api_report_json_body_tags.additional_properties = d
return external_api_report_json_body_tags
check_cache_json_body_tags.additional_properties = d
return check_cache_json_body_tags
@property
def additional_keys(self) -> List[str]:

View File

@@ -4,11 +4,11 @@ from attrs import define
from ..types import UNSET, Unset
T = TypeVar("T", bound="ExternalApiCheckCacheResponse200")
T = TypeVar("T", bound="CheckCacheResponse200")
@define
class ExternalApiCheckCacheResponse200:
class CheckCacheResponse200:
"""
Attributes:
resp_payload (Union[Unset, Any]): JSON-encoded response payload
@@ -31,8 +31,8 @@ class ExternalApiCheckCacheResponse200:
d = src_dict.copy()
resp_payload = d.pop("respPayload", UNSET)
external_api_check_cache_response_200 = cls(
check_cache_response_200 = cls(
resp_payload=resp_payload,
)
return external_api_check_cache_response_200
return check_cache_response_200

View File

@@ -0,0 +1,84 @@
import datetime
from typing import TYPE_CHECKING, Any, Dict, Optional, Type, TypeVar
from attrs import define
from dateutil.parser import isoparse
if TYPE_CHECKING:
from ..models.local_testing_only_get_latest_logged_call_response_200_model_response import (
LocalTestingOnlyGetLatestLoggedCallResponse200ModelResponse,
)
from ..models.local_testing_only_get_latest_logged_call_response_200_tags import (
LocalTestingOnlyGetLatestLoggedCallResponse200Tags,
)
T = TypeVar("T", bound="LocalTestingOnlyGetLatestLoggedCallResponse200")
@define
class LocalTestingOnlyGetLatestLoggedCallResponse200:
"""
Attributes:
created_at (datetime.datetime):
cache_hit (bool):
tags (LocalTestingOnlyGetLatestLoggedCallResponse200Tags):
model_response (Optional[LocalTestingOnlyGetLatestLoggedCallResponse200ModelResponse]):
"""
created_at: datetime.datetime
cache_hit: bool
tags: "LocalTestingOnlyGetLatestLoggedCallResponse200Tags"
model_response: Optional["LocalTestingOnlyGetLatestLoggedCallResponse200ModelResponse"]
def to_dict(self) -> Dict[str, Any]:
created_at = self.created_at.isoformat()
cache_hit = self.cache_hit
tags = self.tags.to_dict()
model_response = self.model_response.to_dict() if self.model_response else None
field_dict: Dict[str, Any] = {}
field_dict.update(
{
"createdAt": created_at,
"cacheHit": cache_hit,
"tags": tags,
"modelResponse": model_response,
}
)
return field_dict
@classmethod
def from_dict(cls: Type[T], src_dict: Dict[str, Any]) -> T:
from ..models.local_testing_only_get_latest_logged_call_response_200_model_response import (
LocalTestingOnlyGetLatestLoggedCallResponse200ModelResponse,
)
from ..models.local_testing_only_get_latest_logged_call_response_200_tags import (
LocalTestingOnlyGetLatestLoggedCallResponse200Tags,
)
d = src_dict.copy()
created_at = isoparse(d.pop("createdAt"))
cache_hit = d.pop("cacheHit")
tags = LocalTestingOnlyGetLatestLoggedCallResponse200Tags.from_dict(d.pop("tags"))
_model_response = d.pop("modelResponse")
model_response: Optional[LocalTestingOnlyGetLatestLoggedCallResponse200ModelResponse]
if _model_response is None:
model_response = None
else:
model_response = LocalTestingOnlyGetLatestLoggedCallResponse200ModelResponse.from_dict(_model_response)
local_testing_only_get_latest_logged_call_response_200 = cls(
created_at=created_at,
cache_hit=cache_hit,
tags=tags,
model_response=model_response,
)
return local_testing_only_get_latest_logged_call_response_200

View File

@@ -0,0 +1,70 @@
from typing import Any, Dict, Optional, Type, TypeVar, Union
from attrs import define
from ..types import UNSET, Unset
T = TypeVar("T", bound="LocalTestingOnlyGetLatestLoggedCallResponse200ModelResponse")
@define
class LocalTestingOnlyGetLatestLoggedCallResponse200ModelResponse:
"""
Attributes:
id (str):
status_code (Optional[float]):
error_message (Optional[str]):
req_payload (Union[Unset, Any]):
resp_payload (Union[Unset, Any]):
"""
id: str
status_code: Optional[float]
error_message: Optional[str]
req_payload: Union[Unset, Any] = UNSET
resp_payload: Union[Unset, Any] = UNSET
def to_dict(self) -> Dict[str, Any]:
id = self.id
status_code = self.status_code
error_message = self.error_message
req_payload = self.req_payload
resp_payload = self.resp_payload
field_dict: Dict[str, Any] = {}
field_dict.update(
{
"id": id,
"statusCode": status_code,
"errorMessage": error_message,
}
)
if req_payload is not UNSET:
field_dict["reqPayload"] = req_payload
if resp_payload is not UNSET:
field_dict["respPayload"] = resp_payload
return field_dict
@classmethod
def from_dict(cls: Type[T], src_dict: Dict[str, Any]) -> T:
d = src_dict.copy()
id = d.pop("id")
status_code = d.pop("statusCode")
error_message = d.pop("errorMessage")
req_payload = d.pop("reqPayload", UNSET)
resp_payload = d.pop("respPayload", UNSET)
local_testing_only_get_latest_logged_call_response_200_model_response = cls(
id=id,
status_code=status_code,
error_message=error_message,
req_payload=req_payload,
resp_payload=resp_payload,
)
return local_testing_only_get_latest_logged_call_response_200_model_response

View File

@@ -0,0 +1,43 @@
from typing import Any, Dict, List, Optional, Type, TypeVar
from attrs import define, field
T = TypeVar("T", bound="LocalTestingOnlyGetLatestLoggedCallResponse200Tags")
@define
class LocalTestingOnlyGetLatestLoggedCallResponse200Tags:
""" """
additional_properties: Dict[str, Optional[str]] = field(init=False, factory=dict)
def to_dict(self) -> Dict[str, Any]:
field_dict: Dict[str, Any] = {}
field_dict.update(self.additional_properties)
field_dict.update({})
return field_dict
@classmethod
def from_dict(cls: Type[T], src_dict: Dict[str, Any]) -> T:
d = src_dict.copy()
local_testing_only_get_latest_logged_call_response_200_tags = cls()
local_testing_only_get_latest_logged_call_response_200_tags.additional_properties = d
return local_testing_only_get_latest_logged_call_response_200_tags
@property
def additional_keys(self) -> List[str]:
return list(self.additional_properties.keys())
def __getitem__(self, key: str) -> Optional[str]:
return self.additional_properties[key]
def __setitem__(self, key: str, value: Optional[str]) -> None:
self.additional_properties[key] = value
def __delitem__(self, key: str) -> None:
del self.additional_properties[key]
def __contains__(self, key: str) -> bool:
return key in self.additional_properties
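
The generated tags models behave like plain mappings over optional string values, for instance:

from openpipe.api_client.models import LocalTestingOnlyGetLatestLoggedCallResponse200Tags

tags = LocalTestingOnlyGetLatestLoggedCallResponse200Tags.from_dict(
    {"$sdk": "python", "promptId": "populate-title"}
)
assert tags["$sdk"] == "python"   # __getitem__
assert "promptId" in tags         # __contains__
assert sorted(tags.additional_keys) == ["$sdk", "promptId"]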

View File

@@ -5,14 +5,14 @@ from attrs import define
from ..types import UNSET, Unset
if TYPE_CHECKING:
from ..models.external_api_report_json_body_tags import ExternalApiReportJsonBodyTags
from ..models.report_json_body_tags import ReportJsonBodyTags
T = TypeVar("T", bound="ExternalApiReportJsonBody")
T = TypeVar("T", bound="ReportJsonBody")
@define
class ExternalApiReportJsonBody:
class ReportJsonBody:
"""
Attributes:
requested_at (float): Unix timestamp in milliseconds
@@ -21,8 +21,8 @@ class ExternalApiReportJsonBody:
resp_payload (Union[Unset, Any]): JSON-encoded response payload
status_code (Union[Unset, float]): HTTP status code of response
error_message (Union[Unset, str]): User-friendly error message
tags (Union[Unset, ExternalApiReportJsonBodyTags]): Extra tags to attach to the call for filtering. Eg {
"userId": "123", "promptId": "populate-title" }
tags (Union[Unset, ReportJsonBodyTags]): Extra tags to attach to the call for filtering. Eg { "userId": "123",
"promptId": "populate-title" }
"""
requested_at: float
@@ -31,7 +31,7 @@ class ExternalApiReportJsonBody:
resp_payload: Union[Unset, Any] = UNSET
status_code: Union[Unset, float] = UNSET
error_message: Union[Unset, str] = UNSET
tags: Union[Unset, "ExternalApiReportJsonBodyTags"] = UNSET
tags: Union[Unset, "ReportJsonBodyTags"] = UNSET
def to_dict(self) -> Dict[str, Any]:
requested_at = self.requested_at
@@ -66,7 +66,7 @@ class ExternalApiReportJsonBody:
@classmethod
def from_dict(cls: Type[T], src_dict: Dict[str, Any]) -> T:
from ..models.external_api_report_json_body_tags import ExternalApiReportJsonBodyTags
from ..models.report_json_body_tags import ReportJsonBodyTags
d = src_dict.copy()
requested_at = d.pop("requestedAt")
@@ -82,13 +82,13 @@ class ExternalApiReportJsonBody:
error_message = d.pop("errorMessage", UNSET)
_tags = d.pop("tags", UNSET)
tags: Union[Unset, ExternalApiReportJsonBodyTags]
tags: Union[Unset, ReportJsonBodyTags]
if isinstance(_tags, Unset):
tags = UNSET
else:
tags = ExternalApiReportJsonBodyTags.from_dict(_tags)
tags = ReportJsonBodyTags.from_dict(_tags)
external_api_report_json_body = cls(
report_json_body = cls(
requested_at=requested_at,
received_at=received_at,
req_payload=req_payload,
@@ -98,4 +98,4 @@ class ExternalApiReportJsonBody:
tags=tags,
)
return external_api_report_json_body
return report_json_body

View File

@@ -2,11 +2,11 @@ from typing import Any, Dict, List, Type, TypeVar
from attrs import define, field
T = TypeVar("T", bound="ExternalApiCheckCacheJsonBodyTags")
T = TypeVar("T", bound="ReportJsonBodyTags")
@define
class ExternalApiCheckCacheJsonBodyTags:
class ReportJsonBodyTags:
"""Extra tags to attach to the call for filtering. Eg { "userId": "123", "promptId": "populate-title" }"""
additional_properties: Dict[str, str] = field(init=False, factory=dict)
@@ -21,10 +21,10 @@ class ExternalApiCheckCacheJsonBodyTags:
@classmethod
def from_dict(cls: Type[T], src_dict: Dict[str, Any]) -> T:
d = src_dict.copy()
external_api_check_cache_json_body_tags = cls()
report_json_body_tags = cls()
external_api_check_cache_json_body_tags.additional_properties = d
return external_api_check_cache_json_body_tags
report_json_body_tags.additional_properties = d
return report_json_body_tags
@property
def additional_keys(self) -> List[str]:

View File

@@ -0,0 +1,40 @@
from typing import Any, Dict, Type, TypeVar
from attrs import define
from ..models.report_response_200_status import ReportResponse200Status
T = TypeVar("T", bound="ReportResponse200")
@define
class ReportResponse200:
"""
Attributes:
status (ReportResponse200Status):
"""
status: ReportResponse200Status
def to_dict(self) -> Dict[str, Any]:
status = self.status.value
field_dict: Dict[str, Any] = {}
field_dict.update(
{
"status": status,
}
)
return field_dict
@classmethod
def from_dict(cls: Type[T], src_dict: Dict[str, Any]) -> T:
d = src_dict.copy()
status = ReportResponse200Status(d.pop("status"))
report_response_200 = cls(
status=status,
)
return report_response_200

View File

@@ -0,0 +1,8 @@
from enum import Enum
class ReportResponse200Status(str, Enum):
OK = "ok"
def __str__(self) -> str:
return str(self.value)

View File

@@ -1,9 +1,9 @@
from typing import Any, Optional
def merge_streamed_chunks(base: Optional[Any], chunk: Any) -> Any:
def merge_openai_chunks(base: Optional[Any], chunk: Any) -> Any:
if base is None:
return merge_streamed_chunks({**chunk, "choices": []}, chunk)
return merge_openai_chunks({**chunk, "choices": []}, chunk)
choices = base["choices"].copy()
for choice in chunk["choices"]:
@@ -34,9 +34,7 @@ def merge_streamed_chunks(base: Optional[Any], chunk: Any) -> Any:
{**new_choice, "message": {"role": "assistant", **choice["delta"]}}
)
merged = {
return {
**base,
"choices": choices,
}
return merged
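
The rename also matches how the helper is consumed: fold the streamed chunks into a single completion-shaped object, as the test suite does further down. A minimal sketch (live OpenAI call, so the model output will vary):

from functools import reduce
from openpipe import openai
from openpipe.merge_openai_chunks import merge_openai_chunks

stream = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "count to 3"}],
    stream=True,
)
# None seeds the fold; the first chunk establishes the base structure.
merged = reduce(merge_openai_chunks, stream, None)
print(merged["choices"][0]["message"]["content"])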

View File

@@ -3,9 +3,16 @@ from openai.openai_object import OpenAIObject
import time
import inspect
from openpipe.merge_openai_chunks import merge_streamed_chunks
from openpipe.merge_openai_chunks import merge_openai_chunks
from openpipe.openpipe_meta import OpenPipeMeta
from .shared import maybe_check_cache, maybe_check_cache_async, report_async, report
from .shared import (
_should_check_cache,
maybe_check_cache,
maybe_check_cache_async,
report_async,
report,
)
class WrappedChatCompletion(original_openai.ChatCompletion):
@@ -29,9 +36,15 @@ class WrappedChatCompletion(original_openai.ChatCompletion):
def _gen():
assembled_completion = None
for chunk in chat_completion:
assembled_completion = merge_streamed_chunks(
assembled_completion = merge_openai_chunks(
assembled_completion, chunk
)
cache_status = (
"MISS" if _should_check_cache(openpipe_options) else "SKIP"
)
chunk.openpipe = OpenPipeMeta(cache_status=cache_status)
yield chunk
received_at = int(time.time() * 1000)
@@ -58,6 +71,10 @@ class WrappedChatCompletion(original_openai.ChatCompletion):
status_code=200,
)
cache_status = (
"MISS" if _should_check_cache(openpipe_options) else "SKIP"
)
chat_completion["openpipe"] = OpenPipeMeta(cache_status=cache_status)
return chat_completion
except Exception as e:
received_at = int(time.time() * 1000)
@@ -96,21 +113,28 @@ class WrappedChatCompletion(original_openai.ChatCompletion):
requested_at = int(time.time() * 1000)
try:
chat_completion = original_openai.ChatCompletion.acreate(*args, **kwargs)
chat_completion = await original_openai.ChatCompletion.acreate(
*args, **kwargs
)
if inspect.isgenerator(chat_completion):
if inspect.isasyncgen(chat_completion):
def _gen():
async def _gen():
assembled_completion = None
for chunk in chat_completion:
assembled_completion = merge_streamed_chunks(
async for chunk in chat_completion:
assembled_completion = merge_openai_chunks(
assembled_completion, chunk
)
cache_status = (
"MISS" if _should_check_cache(openpipe_options) else "SKIP"
)
chunk.openpipe = OpenPipeMeta(cache_status=cache_status)
yield chunk
received_at = int(time.time() * 1000)
report_async(
await report_async(
openpipe_options=openpipe_options,
requested_at=requested_at,
received_at=received_at,
@@ -123,7 +147,7 @@ class WrappedChatCompletion(original_openai.ChatCompletion):
else:
received_at = int(time.time() * 1000)
report_async(
await report_async(
openpipe_options=openpipe_options,
requested_at=requested_at,
received_at=received_at,
@@ -132,12 +156,17 @@ class WrappedChatCompletion(original_openai.ChatCompletion):
status_code=200,
)
cache_status = (
"MISS" if _should_check_cache(openpipe_options) else "SKIP"
)
chat_completion["openpipe"] = OpenPipeMeta(cache_status=cache_status)
return chat_completion
except Exception as e:
received_at = int(time.time() * 1000)
if isinstance(e, original_openai.OpenAIError):
report_async(
await report_async(
openpipe_options=openpipe_options,
requested_at=requested_at,
received_at=received_at,
@@ -147,7 +176,7 @@ class WrappedChatCompletion(original_openai.ChatCompletion):
status_code=e.http_status,
)
else:
report_async(
await report_async(
openpipe_options=openpipe_options,
requested_at=requested_at,
received_at=received_at,

View File

@@ -0,0 +1,7 @@
from attr import dataclass
@dataclass
class OpenPipeMeta:
# Cache status. One of 'HIT', 'MISS', 'SKIP'
cache_status: str
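
Callers see this metadata on every wrapped response; a small usage sketch (caching is opt-in per call):

from openpipe import openai

completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "count to 3"}],
    openpipe={"cache": True},
)
# "MISS" on the first call, "HIT" on a repeat, "SKIP" when caching is disabled.
print(completion.openpipe.cache_status)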

View File

@@ -1,10 +1,10 @@
from openpipe.api_client.api.default import (
external_api_report,
external_api_check_cache,
report as api_report,
check_cache,
)
from openpipe.api_client.client import AuthenticatedClient
from openpipe.api_client.models.external_api_report_json_body_tags import (
ExternalApiReportJsonBodyTags,
from openpipe.api_client.models.report_json_body_tags import (
ReportJsonBodyTags,
)
import toml
import time
@@ -19,9 +19,9 @@ configured_client = AuthenticatedClient(
def _get_tags(openpipe_options):
tags = openpipe_options.get("tags") or {}
tags["$sdk"] = "python"
tags["$sdk_version"] = version
tags["$sdk.version"] = version
return ExternalApiReportJsonBodyTags.from_dict(tags)
return ReportJsonBodyTags.from_dict(tags)
def _should_check_cache(openpipe_options):
@@ -31,7 +31,7 @@ def _should_check_cache(openpipe_options):
def _process_cache_payload(
payload: external_api_check_cache.ExternalApiCheckCacheResponse200,
payload: check_cache.CheckCacheResponse200,
):
if not payload or not payload.resp_payload:
return None
@@ -47,9 +47,9 @@ def maybe_check_cache(
if not _should_check_cache(openpipe_options):
return None
try:
payload = external_api_check_cache.sync(
payload = check_cache.sync(
client=configured_client,
json_body=external_api_check_cache.ExternalApiCheckCacheJsonBody(
json_body=check_cache.CheckCacheJsonBody(
req_payload=req_payload,
requested_at=int(time.time() * 1000),
tags=_get_tags(openpipe_options),
@@ -72,9 +72,9 @@ async def maybe_check_cache_async(
return None
try:
payload = await external_api_check_cache.asyncio(
payload = await check_cache.asyncio(
client=configured_client,
json_body=external_api_check_cache.ExternalApiCheckCacheJsonBody(
json_body=check_cache.CheckCacheJsonBody(
req_payload=req_payload,
requested_at=int(time.time() * 1000),
tags=_get_tags(openpipe_options),
@@ -94,9 +94,9 @@ def report(
**kwargs,
):
try:
external_api_report.sync_detailed(
api_report.sync_detailed(
client=configured_client,
json_body=external_api_report.ExternalApiReportJsonBody(
json_body=api_report.ReportJsonBody(
**kwargs,
tags=_get_tags(openpipe_options),
),
@@ -112,9 +112,9 @@ async def report_async(
**kwargs,
):
try:
await external_api_report.asyncio_detailed(
await api_report.asyncio_detailed(
client=configured_client,
json_body=external_api_report.ExternalApiReportJsonBody(
json_body=api_report.ReportJsonBody(
**kwargs,
tags=_get_tags(openpipe_options),
),
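
All of the above runs against whichever server configure_openpipe points the shared client at. A configuration sketch mirroring the test setup below (the production base_url is an assumption):

import os
from openpipe import configure_openpipe, openai

configure_openpipe(
    base_url="https://app.openpipe.ai/api/v1",  # assumed production URL; tests use http://localhost:3000/api/v1
    api_key=os.getenv("OPENPIPE_API_KEY"),
)
openai.api_key = os.getenv("OPENAI_API_KEY")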

View File

@@ -1,55 +1,106 @@
from functools import reduce
from dotenv import load_dotenv
from . import openai, configure_openpipe
import os
import pytest
from . import openai, configure_openpipe, configured_client
from .api_client.api.default import local_testing_only_get_latest_logged_call
from .merge_openai_chunks import merge_openai_chunks
import random
import string
def random_string(length):
letters = string.ascii_lowercase
return "".join(random.choice(letters) for i in range(length))
load_dotenv()
openai.api_key = os.getenv("OPENAI_API_KEY")
configure_openpipe(
base_url="http://localhost:3000/api", api_key=os.getenv("OPENPIPE_API_KEY")
base_url="http://localhost:3000/api/v1", api_key=os.getenv("OPENPIPE_API_KEY")
)
def last_logged_call():
return local_testing_only_get_latest_logged_call.sync(client=configured_client)
def test_sync():
completion = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
messages=[{"role": "system", "content": "count to 10"}],
messages=[{"role": "system", "content": "count to 3"}],
)
print(completion.choices[0].message.content)
last_logged = last_logged_call()
assert (
last_logged.model_response.resp_payload["choices"][0]["message"]["content"]
== completion.choices[0].message.content
)
assert (
last_logged.model_response.req_payload["messages"][0]["content"] == "count to 3"
)
assert completion.openpipe.cache_status == "SKIP"
def test_streaming():
completion = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
messages=[{"role": "system", "content": "count to 10"}],
messages=[{"role": "system", "content": "count to 4"}],
stream=True,
)
for chunk in completion:
print(chunk)
merged = reduce(merge_openai_chunks, completion, None)
last_logged = last_logged_call()
assert (
last_logged.model_response.resp_payload["choices"][0]["message"]["content"]
== merged["choices"][0]["message"]["content"]
)
async def test_async():
acompletion = await openai.ChatCompletion.acreate(
completion = await openai.ChatCompletion.acreate(
model="gpt-3.5-turbo",
messages=[{"role": "user", "content": "count down from 5"}],
)
last_logged = last_logged_call()
assert (
last_logged.model_response.resp_payload["choices"][0]["message"]["content"]
== completion.choices[0].message.content
)
assert (
last_logged.model_response.req_payload["messages"][0]["content"]
== "count down from 5"
)
print(acompletion.choices[0].message.content)
assert completion.openpipe.cache_status == "SKIP"
async def test_async_streaming():
acompletion = await openai.ChatCompletion.acreate(
completion = await openai.ChatCompletion.acreate(
model="gpt-3.5-turbo",
messages=[{"role": "user", "content": "count down from 5"}],
stream=True,
)
async for chunk in acompletion:
print(chunk)
merged = None
async for chunk in completion:
assert chunk.openpipe.cache_status == "SKIP"
merged = merge_openai_chunks(merged, chunk)
last_logged = last_logged_call()
assert (
last_logged.model_response.resp_payload["choices"][0]["message"]["content"]
== merged["choices"][0]["message"]["content"]
)
assert (
last_logged.model_response.req_payload["messages"][0]["content"]
== "count down from 5"
)
assert merged["openpipe"].cache_status == "SKIP"
def test_sync_with_tags():
@@ -58,31 +109,54 @@ def test_sync_with_tags():
messages=[{"role": "system", "content": "count to 10"}],
openpipe={"tags": {"promptId": "testprompt"}},
)
print("finished")
print(completion.choices[0].message.content)
last_logged = last_logged_call()
assert (
last_logged.model_response.resp_payload["choices"][0]["message"]["content"]
== completion.choices[0].message.content
)
print(last_logged.tags)
assert last_logged.tags["promptId"] == "testprompt"
assert last_logged.tags["$sdk"] == "python"
def test_bad_call():
completion = openai.ChatCompletion.create(
model="gpt-3.5-turbo-blaster",
messages=[{"role": "system", "content": "count to 10"}],
stream=True,
try:
completion = openai.ChatCompletion.create(
model="gpt-3.5-turbo-blaster",
messages=[{"role": "system", "content": "count to 10"}],
stream=True,
)
assert False
except Exception as e:
pass
last_logged = last_logged_call()
print(last_logged)
assert (
last_logged.model_response.error_message
== "The model `gpt-3.5-turbo-blaster` does not exist"
)
assert last_logged.model_response.status_code == 404
@pytest.mark.focus
async def test_caching():
messages = [{"role": "system", "content": f"repeat '{random_string(10)}'"}]
completion = openai.ChatCompletion.create(
model="gpt-3.5-turbo",
messages=[{"role": "system", "content": "count to 10"}],
messages=messages,
openpipe={"cache": True},
)
assert completion.openpipe.cache_status == "MISS"
first_logged = last_logged_call()
assert (
completion.choices[0].message.content
== first_logged.model_response.resp_payload["choices"][0]["message"]["content"]
)
completion2 = await openai.ChatCompletion.acreate(
model="gpt-3.5-turbo",
messages=[{"role": "system", "content": "count to 10"}],
messages=messages,
openpipe={"cache": True},
)
print(completion2)
assert completion2.openpipe.cache_status == "HIT"

View File

@@ -1,133 +0,0 @@
{
"openapi": "3.0.3",
"info": {
"title": "OpenPipe API",
"description": "The public API for reporting API calls to OpenPipe",
"version": "0.1.0"
},
"servers": [{ "url": "https://app.openpipe.ai/api" }],
"paths": {
"/v1/check-cache": {
"post": {
"operationId": "externalApi-checkCache",
"description": "Check if a prompt is cached",
"security": [{ "Authorization": [] }],
"requestBody": {
"required": true,
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"startTime": {
"type": "number",
"description": "Unix timestamp in milliseconds"
},
"reqPayload": { "description": "JSON-encoded request payload" },
"tags": {
"type": "object",
"additionalProperties": { "type": "string" },
"description": "Extra tags to attach to the call for filtering. Eg { \"userId\": \"123\", \"promptId\": \"populate-title\" }"
}
},
"required": ["startTime"],
"additionalProperties": false
}
}
}
},
"parameters": [],
"responses": {
"200": {
"description": "Successful response",
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"respPayload": { "description": "JSON-encoded response payload" }
},
"additionalProperties": false
}
}
}
},
"default": { "$ref": "#/components/responses/error" }
}
}
},
"/v1/report": {
"post": {
"operationId": "externalApi-report",
"description": "Report an API call",
"security": [{ "Authorization": [] }],
"requestBody": {
"required": true,
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"startTime": {
"type": "number",
"description": "Unix timestamp in milliseconds"
},
"endTime": { "type": "number", "description": "Unix timestamp in milliseconds" },
"reqPayload": { "description": "JSON-encoded request payload" },
"respPayload": { "description": "JSON-encoded response payload" },
"respStatus": { "type": "number", "description": "HTTP status code of response" },
"error": { "type": "string", "description": "User-friendly error message" },
"tags": {
"type": "object",
"additionalProperties": { "type": "string" },
"description": "Extra tags to attach to the call for filtering. Eg { \"userId\": \"123\", \"promptId\": \"populate-title\" }"
}
},
"required": ["startTime", "endTime"],
"additionalProperties": false
}
}
}
},
"parameters": [],
"responses": {
"200": {
"description": "Successful response",
"content": { "application/json": { "schema": {} } }
},
"default": { "$ref": "#/components/responses/error" }
}
}
}
},
"components": {
"securitySchemes": { "Authorization": { "type": "http", "scheme": "bearer" } },
"responses": {
"error": {
"description": "Error response",
"content": {
"application/json": {
"schema": {
"type": "object",
"properties": {
"message": { "type": "string" },
"code": { "type": "string" },
"issues": {
"type": "array",
"items": {
"type": "object",
"properties": { "message": { "type": "string" } },
"required": ["message"],
"additionalProperties": false
}
}
},
"required": ["message", "code"],
"additionalProperties": false
}
}
}
}
}
}
}

Some files were not shown because too many files have changed in this diff.