
@livepeer/ai

v0.3.3


Downloads: 136


Livepeer AI JavaScript/TypeScript Library

Welcome to the Livepeer AI JavaScript/TypeScript Library! This library offers a seamless integration with the Livepeer AI API, enabling you to easily incorporate powerful AI capabilities into your JavaScript applications, whether they run in the browser or on the server side.

SDK Installation

The SDK can be installed with the npm, pnpm, bun, or yarn package managers.

NPM

npm add @livepeer/ai

PNPM

pnpm add @livepeer/ai

Bun

bun add @livepeer/ai

Yarn

yarn add @livepeer/ai zod

# Note that Yarn does not install peer dependencies automatically. You will need
# to install zod as shown above.

Requirements

For supported JavaScript runtimes, please consult RUNTIMES.md.

SDK Example Usage

Example

import { Livepeer } from "@livepeer/ai";

const livepeer = new Livepeer({
  httpBearer: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
  const result = await livepeer.generate.textToImage({
    prompt: "<value>",
  });

  // Handle the result
  console.log(result);
}

run();

Available Resources and Operations

generate

Standalone functions

All the methods listed above are available as standalone functions. These functions are ideal for use in applications running in the browser, serverless runtimes or other environments where application bundle size is a primary concern. When using a bundler to build your application, all unused functionality will be either excluded from the final bundle or tree-shaken away.

To read more about standalone functions, check FUNCTIONS.md.
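
For illustration, here is a minimal sketch of calling a standalone function. The module paths and names used here are assumptions based on the usual layout of generated SDKs; FUNCTIONS.md lists the exact exports.

import { LivepeerCore } from "@livepeer/ai/core.js";
import { generateTextToImage } from "@livepeer/ai/funcs/generateTextToImage.js";

// The lighter-weight core client carries only configuration, so bundlers can
// tree-shake any operations you do not import. (Module paths above are assumed.)
const livepeer = new LivepeerCore({
  httpBearer: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
  // Standalone functions take the core client as their first argument and
  // return a result object instead of throwing on API errors.
  const res = await generateTextToImage(livepeer, {
    prompt: "<value>",
  });

  if (!res.ok) {
    throw res.error;
  }

  // Handle the result
  console.log(res.value);
}

run();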

File uploads

Certain SDK methods accept files as part of a multi-part request. It is possible and typically recommended to upload files as a stream rather than reading the entire contents into memory. This avoids excessive memory consumption and potentially crashing with out-of-memory errors when working with very large files. The following example demonstrates how to attach a file stream to a request.

[!TIP]

Depending on your JavaScript runtime, there are convenient utilities that return a handle to a file without reading the entire contents into memory:

  • Node.js v20+: Since v20, Node.js comes with a native openAsBlob function in node:fs.
  • Bun: The native Bun.file function produces a file handle that can be used for streaming file uploads.
  • Browsers: All supported browsers return an instance of File when reading the value from an <input type="file"> element (a browser sketch follows the Node.js example below).
  • Node.js v18: A file stream can be created using the fileFrom helper from fetch-blob/from.js.

import { Livepeer } from "@livepeer/ai";
import { openAsBlob } from "node:fs";

const livepeer = new Livepeer({
  httpBearer: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
  const result = await livepeer.generate.imageToImage({
    image: await openAsBlob("example.file"),
    prompt: "<value>",
  });

  // Handle the result
  console.log(result);
}

run();
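
In a browser, the File obtained from an <input type="file"> element can be passed in the same way. The following is a sketch; the element id "image-input" and the surrounding markup are hypothetical.

import { Livepeer } from "@livepeer/ai";

const livepeer = new Livepeer({
  httpBearer: "<YOUR_BEARER_TOKEN_HERE>",
});

// Hypothetical file input; the id is only for illustration.
const input = document.querySelector<HTMLInputElement>("#image-input");

input?.addEventListener("change", async () => {
  const file = input.files?.[0];
  if (!file) return;

  // The File can be passed directly as the multipart image field.
  const result = await livepeer.generate.imageToImage({
    image: file,
    prompt: "<value>",
  });

  // Handle the result
  console.log(result);
});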

Retries

Some of the endpoints in this SDK support retries. If you use the SDK without any configuration, it will fall back to the default retry strategy provided by the API. However, the default retry strategy can be overridden on a per-operation basis, or across the entire SDK.

To change the default retry strategy for a single API call, simply provide a retryConfig object to the call:

import { Livepeer } from "@livepeer/ai";

const livepeer = new Livepeer({
  httpBearer: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
  const result = await livepeer.generate.textToImage({
    prompt: "<value>",
  }, {
    retries: {
      strategy: "backoff",
      backoff: {
        initialInterval: 1,
        maxInterval: 50,
        exponent: 1.1,
        maxElapsedTime: 100,
      },
      retryConnectionErrors: false,
    },
  });

  // Handle the result
  console.log(result);
}

run();

If you'd like to override the default retry strategy for all operations that support retries, you can provide a retryConfig at SDK initialization:

import { Livepeer } from "@livepeer/ai";

const livepeer = new Livepeer({
  retryConfig: {
    strategy: "backoff",
    backoff: {
      initialInterval: 1,
      maxInterval: 50,
      exponent: 1.1,
      maxElapsedTime: 100,
    },
    retryConnectionErrors: false,
  },
  httpBearer: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
  const result = await livepeer.generate.textToImage({
    prompt: "<value>",
  });

  // Handle the result
  console.log(result);
}

run();

Error Handling

All SDK methods return a response object or throw an error. If Error objects are specified in your OpenAPI Spec, the SDK will throw the appropriate Error type.

| Error Object               | Status Code | Content Type     |
| -------------------------- | ----------- | ---------------- |
| errors.HTTPError           | 400,401,500 | application/json |
| errors.HTTPValidationError | 422         | application/json |
| errors.SDKError            | 4xx-5xx     | */*              |

Validation errors can also occur when either method arguments or data returned from the server do not match the expected format. The SDKValidationError that is thrown as a result will capture the raw value that failed validation in an attribute called rawValue. Additionally, a pretty() method is available on this error that can be used to log a nicely formatted string, since validation errors can list many issues and the plain error string may be difficult to read when debugging.

import { Livepeer } from "@livepeer/ai";
import {
  HTTPError,
  HTTPValidationError,
  SDKValidationError,
} from "@livepeer/ai/models/errors";

const livepeer = new Livepeer({
  httpBearer: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
  let result;
  try {
    result = await livepeer.generate.textToImage({
      prompt: "<value>",
    });

    // Handle the result
    console.log(result);
  } catch (err) {
    switch (true) {
      case (err instanceof SDKValidationError): {
        // Validation errors can be pretty-printed
        console.error(err.pretty());
        // Raw value may also be inspected
        console.error(err.rawValue);
        return;
      }
      case (err instanceof HTTPError): {
        // Handle err.data$: HTTPErrorData
        console.error(err);
        return;
      }
      case (err instanceof HTTPValidationError): {
        // Handle err.data$: HTTPValidationErrorData
        console.error(err);
        return;
      }
      default: {
        throw err;
      }
    }
  }
}

run();

Server Selection

Select Server by Index

You can override the default server globally by passing a server index to the serverIdx optional parameter when initializing the SDK client instance. The selected server will then be used as the default on the operations that use it. This table lists the indexes associated with the available servers:

| # | Server                                    | Variables |
| - | ----------------------------------------- | --------- |
| 0 | https://dream-gateway.livepeer.cloud      | None      |
| 1 | https://livepeer.studio/api/beta/generate | None      |

import { Livepeer } from "@livepeer/ai";

const livepeer = new Livepeer({
  serverIdx: 1,
  httpBearer: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
  const result = await livepeer.generate.textToImage({
    prompt: "<value>",
  });

  // Handle the result
  console.log(result);
}

run();

Override Server URL Per-Client

The default server can also be overridden globally by passing a URL to the serverURL optional parameter when initializing the SDK client instance. For example:

import { Livepeer } from "@livepeer/ai";

const livepeer = new Livepeer({
  serverURL: "https://dream-gateway.livepeer.cloud",
  httpBearer: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
  const result = await livepeer.generate.textToImage({
    prompt: "<value>",
  });

  // Handle the result
  console.log(result);
}

run();

Custom HTTP Client

The TypeScript SDK makes API calls using an HTTPClient that wraps the native Fetch API. This client is a thin wrapper around fetch and provides the ability to attach hooks around the request lifecycle that can be used to modify the request or to handle errors and responses.

The HTTPClient constructor takes an optional fetcher argument that can be used to integrate a third-party HTTP client or when writing tests to mock out the HTTP client and feed in fixtures.

The following example shows how to use the "beforeRequest" hook to add a custom header and a timeout to requests, and how to use the "requestError" hook to log errors:

import { Livepeer } from "@livepeer/ai";
import { HTTPClient } from "@livepeer/ai/lib/http";

const httpClient = new HTTPClient({
  // fetcher takes a function that has the same signature as native `fetch`.
  fetcher: (request) => {
    return fetch(request);
  }
});

httpClient.addHook("beforeRequest", (request) => {
  const nextRequest = new Request(request, {
    signal: request.signal || AbortSignal.timeout(5000)
  });

  nextRequest.headers.set("x-custom-header", "custom value");

  return nextRequest;
});

httpClient.addHook("requestError", (error, request) => {
  console.group("Request Error");
  console.log("Reason:", `${error}`);
  console.log("Endpoint:", `${request.method} ${request.url}`);
  console.groupEnd();
});

const sdk = new Livepeer({ httpClient });
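
For tests, the same fetcher option can return canned responses so that no network calls are made. Below is a minimal sketch; the response body is a made-up fixture, not the real API response shape.

import { Livepeer } from "@livepeer/ai";
import { HTTPClient } from "@livepeer/ai/lib/http";

// Every request resolves to a fixed fixture instead of reaching the network.
const mockClient = new HTTPClient({
  fetcher: async () => {
    return new Response(JSON.stringify({ images: [] }), {
      status: 200,
      headers: { "Content-Type": "application/json" },
    });
  },
});

const sdkForTests = new Livepeer({
  httpBearer: "test-token",
  httpClient: mockClient,
});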

Authentication

Per-Client Security Schemes

This SDK supports the following security scheme globally:

| Name       | Type | Scheme      |
| ---------- | ---- | ----------- |
| httpBearer | http | HTTP Bearer |

To authenticate with the API, the httpBearer parameter must be set when initializing the SDK client instance. For example:

import { Livepeer } from "@livepeer/ai";

const livepeer = new Livepeer({
  httpBearer: "<YOUR_BEARER_TOKEN_HERE>",
});

async function run() {
  const result = await livepeer.generate.textToImage({
    prompt: "<value>",
  });

  // Handle the result
  console.log(result);
}

run();

Debugging

You can set up your SDK to emit debug logs for SDK requests and responses.

You can pass a logger that matches console's interface as an SDK option.

[!WARNING] Beware that debug logging will reveal secrets, like API tokens in headers, in log messages printed to a console or files. It's recommended to use this feature only during local development and not in production.

import { Livepeer } from "@livepeer/ai";

const sdk = new Livepeer({ debugLogger: console });

Summary

Livepeer AI Runner: An application to run AI pipelines

Development

Maturity

This SDK is in alpha, and there may be breaking changes between versions without a major version update. Therefore, we recommend pinning usage to a specific package version. This way, you can install the same version each time without breaking changes unless you are intentionally looking for the latest version.
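
For example, you can pin to the exact version shown on this page rather than a semver range:

npm add --save-exact @livepeer/ai@0.3.3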

Contributions

While we value open-source contributions to this SDK, this library is generated programmatically. Any manual changes added to internal files will be overwritten on the next generation. We look forward to hearing your feedback. Feel free to open a PR or an issue with a proof of concept and we'll do our best to include it in a future release.

SDK Created by Speakeasy