@airbend3r/client · v0.1.13 · 83 downloads

Airbender Client

The Airbender Client provides a comprehensive logging and control system for various AI providers (OpenAI, Anthropic, Google) and the Vercel AI SDK. It offers interaction logging, feedback handling, and dynamic model control.

Installation

```bash
npm install @airbend3r/client
```
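
If your project uses pnpm or Yarn instead of npm, the equivalent installs are:

```bash
pnpm add @airbend3r/client
# or
yarn add @airbend3r/client
```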

Getting Started

1. Obtain Your Product Key

Before setting up the client, you'll need to obtain a product key from the Airbender Dashboard:

  1. Create a new product in the Airbender Dashboard.
  2. Once created, you'll find your product key in the product settings (shown in the "Product Key Location" screenshot).
  3. Store this key securely in your environment variables:

```bash
# .env
AIRBENDER_PRODUCT_KEY=your_product_key_here
```
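
At runtime the key is read from the environment. A minimal fail-fast check might look like the following sketch (the variable name matches the .env example above; where you place this check is up to your app):

```typescript
// Read the product key set in .env and fail fast if it is missing.
const productKey = process.env.AIRBENDER_PRODUCT_KEY;

if (!productKey) {
  throw new Error('AIRBENDER_PRODUCT_KEY is not set');
}
```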

2. Basic Setup

The Airbender client uses a centralized configuration approach. For Vercel AI SDK implementations, you first need to wrap the AI functions with Airbender's logging wrapper:

```typescript
import { setupAirbender, wrappedStreamText } from '@airbend3r/client';

// First, create a wrapped version of streamText with logging configuration
const streamText = wrappedStreamText({
  productKey: process.env.AIRBENDER_PRODUCT_KEY, // The product key you obtained from the dashboard
  logInputs: true,
  logOutputs: true,
  shouldValidateBeforeLogging: true,
});

// Then use the wrapped version in Airbender setup
export const airbender = setupAirbender({
  sdks: {
    default: {
      llm: streamText,
      name: 'openAi',
      version: 'gpt-4o',
    },
    custom: {
      llm: streamText,
      name: 'anthropic',
      version: 'claude-3-5-sonnet-20240620',
    }
  },
  productId: process.env.AIRBENDER_PRODUCT_ID,
  modelAvailability: {
    providers: {
      openai: ['gpt-4o', 'gpt-4o-mini', 'gpt-4-turbo'],
      google: ['gemini-1.5-flash', 'gemini-1.5-pro'],
      anthropic: ['claude-3-5-sonnet-20240620'],
    }
  },
});

// Access specific SDK configurations
export const { llm: airbenderStreamText } = airbender.sdk('default');
```
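
The same accessor works for any key declared under `sdks`. For example, the `custom` entry configured above can be pulled out alongside the default one (the exported name below is only illustrative):

```typescript
// `airbender` is the instance created by setupAirbender above.
export const { llm: anthropicStreamText } = airbender.sdk('custom');
```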

Multiple SDK Configurations

You can define multiple SDK configurations with different wrapped implementations:

```typescript
import { setupAirbender, wrappedGenerateText, wrappedStreamText } from '@airbend3r/client';

// Create wrapped versions of different AI functions
const streamText = wrappedStreamText({
  productKey: process.env.AIRBENDER_PRODUCT_KEY,
  logInputs: true,
  logOutputs: true,
});

const generateText = wrappedGenerateText({
  productKey: process.env.AIRBENDER_PRODUCT_KEY,
  logInputs: true,
  logOutputs: true,
});

const airbender = setupAirbender({
  sdks: {
    chat: {
      llm: streamText,
      name: 'openAi',
      version: 'gpt-4o',
    },
    completion: {
      llm: generateText,
      name: 'anthropic',
      version: 'claude-3-5-sonnet-20240620',
    },
    search: {
      llm: streamText,
      name: 'google',
      version: 'gemini-1.5-pro',
    }
  },
  productId: process.env.AIRBENDER_PRODUCT_ID,
  modelAvailability: {
    providers: {
      openai: ['gpt-4o', 'gpt-4o-mini'],
      google: ['gemini-1.5-flash', 'gemini-1.5-pro'],
      anthropic: ['claude-3-5-sonnet-20240620'],
    }
  },
});
```

Usage Examples

Using Configured SDKs

Stream Text Example

```typescript
const { llm: streamTextWithLogging } = airbender.sdk('chat');

export async function POST(req: Request) {
  const { messages } = await req.json();

  return streamTextWithLogging(
    {
      model: { provider: 'openai', modelId: 'gpt-4o' },
      messages,
      system: "You are a helpful assistant",
    },
    {
      sessionID: 'user-123',
      dynamicModel: true, // Enable server-side model control
      logName: 'chat-interaction'
    }
  );
}
```

Generate Text Example

```typescript
const { llm: generateTextWithLogging } = airbender.sdk('completion');

const response = await generateTextWithLogging(
  {
    model: { provider: 'anthropic', modelId: 'claude-3-5-sonnet-20240620' },
    messages: [{ role: 'user', content: 'Hello!' }],
    system: "You are a helpful assistant",
  },
  {
    sessionID: 'user-123',
    dynamicModel: true,
  }
);
```

Advanced Features

Dynamic Model Selection

```typescript
const { llm } = airbender.sdk('chat');

const response = await llm(
  {
    model: { provider: 'openai', modelId: 'gpt-4o' },
    messages: [{ role: 'user', content: 'Hello!' }],
  },
  {
    dynamicModel: true,
    sessionID: 'user-123',
  }
);
```

System Prompt Override

```typescript
const { llm } = airbender.sdk('chat');

const response = await llm(
  {
    model: { provider: 'openai', modelId: 'gpt-4o' },
    messages: messages,
    system: "Default system prompt",
  },
  {
    sessionID: 'user-123',
    dynamicModel: true,
  }
);
```

Logging with Names

```typescript
const { llm } = airbender.sdk('chat');

const response = await llm(
  {
    model: { provider: 'openai', modelId: 'gpt-4o' },
    messages: [{ role: 'user', content: 'Hello!' }],
  },
  {
    sessionID: 'user-123',
    logName: 'customer-support-chat',
  }
);
```

Feedback Handling

```typescript
const feedback = await airbender.logFeedback(logId, {
  rating: 5,
  comment: 'Great response!',
  id: 'feedback-123'
});
```
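
The `logId` used here isn't created in this snippet. One way to obtain it is the `onLogId` callback exposed by `LoggerConfig` (documented below); this sketch assumes that callback fires once the interaction has been logged:

```typescript
import { wrappedStreamText } from '@airbend3r/client';

// Capture the log ID emitted by the wrapper so feedback can be attached later.
let lastLogId: string | undefined;

const streamText = wrappedStreamText({
  productKey: process.env.AIRBENDER_PRODUCT_KEY,
  logInputs: true,
  logOutputs: true,
  onLogId: (logId) => {
    lastLogId = logId;
  },
});

// ...after an interaction has completed, attach feedback to it
// (`airbender` is the instance created with setupAirbender earlier):
if (lastLogId) {
  await airbender.logFeedback(lastLogId, {
    rating: 5,
    comment: 'Great response!',
    id: 'feedback-123',
  });
}
```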

Configuration Options

AirbenderConfigProps Interface

```typescript
interface AirbenderConfigProps<T> {
  sdks: {
    [K in keyof T]: {
      llm: any;
      name: string;
      version: string;
    }
  };
  productId: string;
  modelAvailability?: {
    providers: {
      openai?: string[];
      anthropic?: string[];
      google?: string[];
    }
  };
}
```

LoggerConfig Interface

```typescript
interface LoggerConfig {
  logInputs?: boolean;
  logOutputs?: boolean;
  additionalInfo?: any;
  onResponse?: (response: any) => void;
  onLogId?: (logId: string) => void;
  sessionID?: string;
  productKey: string;
  blockingConfig?: {
    messageOnBlock: string;
  };
  shouldValidateBeforeLogging?: boolean;
  logName?: string;
  dynamicModel?: boolean;
}
```
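
To show the remaining options in context, here is a sketch of a wrapper that uses the callback, metadata, and blocking fields; the metadata values and the `messageOnBlock` text are purely illustrative:

```typescript
import { wrappedStreamText } from '@airbend3r/client';

const streamText = wrappedStreamText({
  productKey: process.env.AIRBENDER_PRODUCT_KEY,
  logInputs: true,
  logOutputs: true,
  // Arbitrary metadata attached to logged interactions (illustrative values).
  additionalInfo: { appVersion: '1.4.2', environment: 'staging' },
  // Inspect responses and capture log IDs as they are produced.
  onResponse: (response) => console.debug('airbender response', response),
  onLogId: (logId) => console.debug('airbender log id', logId),
  // Message associated with blocked requests.
  blockingConfig: {
    messageOnBlock: 'This request was blocked by policy.',
  },
});
```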

Error Handling

```typescript
try {
  const { llm } = airbender.sdk('chat');
  const response = await llm(
    {
      model: { provider: 'openai', modelId: 'gpt-4o' },
      messages: [{ role: 'user', content: 'Hello!' }],
    },
    {
      sessionID: 'user-123',
    }
  );
} catch (error) {
  // In TypeScript, narrow the unknown error before reading its message.
  if (error instanceof Error && error.message.includes('blocked')) {
    // Handle blocked request
  } else {
    // Handle other errors
  }
}
```

License

MIT License