OpenInference Vercel
This package provides a set of utilities to ingest Vercel AI SDK (>= 3.3) spans into platforms like Arize and Phoenix.
Installation
npm install --save @arizeai/openinference-vercel
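The usage example below also imports from @vercel/otel and @opentelemetry/sdk-trace-base; if those packages are not already in your project, install them as well:
npm install --save @vercel/otel @opentelemetry/sdk-trace-base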
Usage
To process your Vercel AI SDK spans, add the OpenInferenceSpanProcessor to your span processors along with any other span processors you wish to use.
Note: The OpenInferenceSpanProcessor does not handle the exporting of spans, so you will want to pair it with another span processor that accepts an exporter as a parameter.
import { registerOTel, OTLPHttpProtoTraceExporter } from "@vercel/otel";
import { OpenInferenceSpanProcessor } from "@arizeai/openinference-vercel";
import { SimpleSpanProcessor } from "@opentelemetry/sdk-trace-base";

export function register() {
  registerOTel({
    serviceName: "next-app",
    spanProcessors: [
      // Converts Vercel AI SDK spans into OpenInference semantic conventions
      new OpenInferenceSpanProcessor(),
      // Exports the processed spans, e.g. to a local Phoenix collector
      new SimpleSpanProcessor(
        new OTLPHttpProtoTraceExporter({
          url: "http://localhost:6006/v1/traces",
        }),
      ),
    ],
  });
}
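In a Next.js app, this register function typically lives in an instrumentation.ts file at the project root. Keep in mind that Vercel AI SDK telemetry is opt-in per call, so spans are only emitted when experimental_telemetry is enabled on the call. A minimal sketch of an instrumented call (the model, function name, and prompt here are illustrative):

import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

export async function generateHaiku() {
  const { text } = await generateText({
    model: openai("gpt-4o-mini"),
    prompt: "Write a haiku about tracing.",
    // Telemetry is disabled by default; enabling it here lets the
    // span processors registered above receive this call's spans.
    experimental_telemetry: { isEnabled: true },
  });
  return text;
}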
Examples
To see an example, go to the Next.js OpenAI Telemetry Example in the examples directory of this repo.
For more information on Vercel OpenTelemetry support, see the Vercel AI SDK Telemetry documentation.