
OpenPipe Node API Library


This library wraps TypeScript and JavaScript OpenAI API calls and logs additional data to the configured OPENPIPE_BASE_URL for further processing.

It is fully compatible with OpenAI's SDK and logs both streaming and non-streaming requests and responses.

Installation

npm install --save openpipe
# or
yarn add openpipe

Usage

  1. Create a project at https://app.openpipe.ai
  2. Find your project's API key at https://app.openpipe.ai/project/settings
  3. Configure the OpenPipe client as shown below.
// import OpenAI from 'openai'
import OpenAI from "openpipe/openai";

// Fully compatible with original OpenAI initialization
// Fully compatible with original OpenAI initialization
const openai = new OpenAI({
  apiKey: "my api key", // OpenAI API key; defaults to process.env["OPENAI_API_KEY"]
  // the openpipe key is optional
  openpipe: {
    apiKey: "my api key", // OpenPipe API key; defaults to process.env["OPENPIPE_API_KEY"]
    baseUrl: "my url", // defaults to process.env["OPENPIPE_BASE_URL"], or https://app.openpipe.ai/api/v1 if not set
  },
});

async function main() {
  // Allows optional openpipe object
  const completion = await openai.chat.completions.create({
    messages: [{ role: "user", content: "Say this is a test" }],
    model: "gpt-3.5-turbo",
    // optional
    openpipe: {
      // Add custom searchable tags
      tags: {
        prompt_id: "getCompletion",
        any_key: "any_value",
      },
    },
  });

  console.log(completion.choices);
}

main();
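Streaming requests are captured as well. Below is a minimal sketch of a streaming call, assuming the same client configuration as above and a valid OPENAI_API_KEY; the per-request `openpipe` tags are assumed to work the same way for streaming as for non-streaming calls.

```typescript
import OpenAI from "openpipe/openai";

const openai = new OpenAI();

async function streamCompletion() {
  // Passing stream: true returns an async iterable of chunks,
  // as in the underlying OpenAI SDK.
  const stream = await openai.chat.completions.create({
    messages: [{ role: "user", content: "Say this is a test" }],
    model: "gpt-3.5-turbo",
    stream: true,
    // optional, as in the non-streaming example
    openpipe: {
      tags: { prompt_id: "streamCompletion" },
    },
  });

  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}

streamCompletion();
```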

FAQ

How do I report calls to my self-hosted instance?

Start an instance by following the instructions on Running Locally. Once it's running, point your OPENPIPE_BASE_URL to your self-hosted instance.
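For example, in your shell (the port below is an assumption; use whatever your local instance actually listens on):

```shell
# Point the SDK at your self-hosted instance instead of app.openpipe.ai
export OPENPIPE_API_KEY="your-self-hosted-project-key"
export OPENPIPE_BASE_URL="http://localhost:3000/api/v1"
```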

What if my OPENPIPE_BASE_URL is misconfigured or my instance goes down? Will my OpenAI calls stop working?

Your OpenAI calls will continue to function as expected no matter what. The SDK handles logging errors gracefully without affecting OpenAI inference.
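One way to picture this guarantee is a wrapper that swallows any failure in the reporting step. This is a hypothetical sketch of the pattern, not the SDK's actual implementation; `reportLog` is an illustrative name.

```typescript
// Hypothetical illustration of fire-and-forget logging: a failed
// report is caught and logged locally, so it never propagates to
// the caller or interferes with the OpenAI response.
async function reportLog(report: () => Promise<void>): Promise<void> {
  try {
    await report();
  } catch (err) {
    // Swallow the logging error; inference already succeeded.
    console.error("OpenPipe logging failed:", err);
  }
}

// Even if the report rejects (e.g. the base URL is unreachable),
// the returned promise resolves normally.
reportLog(async () => {
  throw new Error("OPENPIPE_BASE_URL unreachable");
});
```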

See the GitHub repo for more details.