# Airbender Client

The Airbender Client provides a comprehensive logging and control system for various AI providers (OpenAI, Anthropic, Google) and the Vercel AI SDK. It offers functionality for logging interactions, handling feedback, and dynamic model control.
## Installation

```bash
npm install @airbend3r/client
```
## Getting Started

### 1. Obtain Your Product Key
Before setting up the client, you'll need to obtain a product key from the Airbender Dashboard:

- Create a new product in the Airbender Dashboard.
- Once created, you'll find your product key in the product settings.
- Store this key securely in your environment variables:

```bash
# .env
AIRBENDER_PRODUCT_KEY=your_product_key_here
```
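Since the wrappers read the key from `process.env`, it can help to fail fast at startup when the variable is missing rather than during a request. A minimal sketch (the `requireEnv` helper is illustrative, not part of `@airbend3r/client`):

```typescript
// Illustrative helper (not part of @airbend3r/client): throw early if a
// required environment variable is missing, instead of failing mid-request.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage at startup:
// const productKey = requireEnv('AIRBENDER_PRODUCT_KEY');
```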
### 2. Basic Setup

The Airbender client uses a centralized configuration approach. For Vercel AI SDK implementations, you first need to wrap the AI functions with Airbender's logging wrapper:
```typescript
import { setupAirbender, wrappedStreamText } from '@airbend3r/client';

// First, create a wrapped version of streamText with logging configuration
const streamText = wrappedStreamText({
  productKey: process.env.AIRBENDER_PRODUCT_KEY, // The product key you obtained from the dashboard
  logInputs: true,
  logOutputs: true,
  shouldValidateBeforeLogging: true,
});

// Then use the wrapped version in the Airbender setup
export const airbender = setupAirbender({
  sdks: {
    default: {
      llm: streamText,
      name: 'openAi',
      version: 'gpt-4o',
    },
    custom: {
      llm: streamText,
      name: 'anthropic',
      version: 'claude-3-5-sonnet-20240620',
    },
  },
  productId: process.env.AIRBENDER_PRODUCT_ID,
  modelAvailability: {
    providers: {
      openai: ['gpt-4o', 'gpt-4o-mini', 'gpt-4-turbo'],
      google: ['gemini-1.5-flash', 'gemini-1.5-pro'],
      anthropic: ['claude-3-5-sonnet-20240620'],
    },
  },
});

// Access specific SDK configurations
export const { llm: airbenderStreamText } = airbender.sdk('default');
```
## Multiple SDK Configurations

You can define multiple SDK configurations with different wrapped implementations:
```typescript
import { setupAirbender, wrappedStreamText, wrappedGenerateText } from '@airbend3r/client';

// Create wrapped versions of different AI functions
const streamText = wrappedStreamText({
  productKey: process.env.AIRBENDER_PRODUCT_KEY,
  logInputs: true,
  logOutputs: true,
});

const generateText = wrappedGenerateText({
  productKey: process.env.AIRBENDER_PRODUCT_KEY,
  logInputs: true,
  logOutputs: true,
});

const airbender = setupAirbender({
  sdks: {
    chat: {
      llm: streamText,
      name: 'openAi',
      version: 'gpt-4o',
    },
    completion: {
      llm: generateText,
      name: 'anthropic',
      version: 'claude-3-5-sonnet-20240620',
    },
    search: {
      llm: streamText,
      name: 'google',
      version: 'gemini-1.5-pro',
    },
  },
  productId: process.env.AIRBENDER_PRODUCT_ID,
  modelAvailability: {
    providers: {
      openai: ['gpt-4o', 'gpt-4o-mini'],
      google: ['gemini-1.5-flash', 'gemini-1.5-pro'],
      anthropic: ['claude-3-5-sonnet-20240620'],
    },
  },
});
```
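The `modelAvailability` map constrains which models each provider may use. As an illustration of the kind of check it enables (illustrative only; the actual enforcement happens inside the Airbender client and dashboard, not in your code), a lookup might work like this:

```typescript
// Illustrative only: a lookup of the kind modelAvailability enables.
// The actual enforcement is done by the Airbender client/dashboard.
const providers: Record<string, string[]> = {
  openai: ['gpt-4o', 'gpt-4o-mini'],
  google: ['gemini-1.5-flash', 'gemini-1.5-pro'],
  anthropic: ['claude-3-5-sonnet-20240620'],
};

function isModelAllowed(provider: string, modelId: string): boolean {
  return providers[provider]?.includes(modelId) ?? false;
}
```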
## Usage Examples

### Using Configured SDKs

#### Stream Text Example
```typescript
const { llm: streamTextWithLogging } = airbender.sdk('chat');

export async function POST(req: Request) {
  const { messages } = await req.json();

  return streamTextWithLogging(
    {
      model: { provider: 'openai', modelId: 'gpt-4o' },
      messages,
      system: "You are a helpful assistant",
    },
    {
      sessionID: 'user-123',
      dynamicModel: true, // Enable server-side model control
      logName: 'chat-interaction',
    }
  );
}
```
#### Generate Text Example
```typescript
const { llm: generateTextWithLogging } = airbender.sdk('completion');

const response = await generateTextWithLogging(
  {
    model: { provider: 'anthropic', modelId: 'claude-3-5-sonnet-20240620' },
    messages: [{ role: 'user', content: 'Hello!' }],
    system: "You are a helpful assistant",
  },
  {
    sessionID: 'user-123',
    dynamicModel: true,
  }
);
```
## Advanced Features

### Dynamic Model Selection

With `dynamicModel` enabled, the model used for a session can be controlled server-side, overriding the model specified in code:
```typescript
const { llm } = airbender.sdk('chat');

const response = await llm(
  {
    model: { provider: 'openai', modelId: 'gpt-4o' },
    messages: [{ role: 'user', content: 'Hello!' }],
  },
  {
    dynamicModel: true,
    sessionID: 'user-123',
  }
);
```
### System Prompt Override

When `dynamicModel` is enabled, a system prompt configured server-side can take the place of the default one supplied in code:
```typescript
const { llm } = airbender.sdk('chat');

const response = await llm(
  {
    model: { provider: 'openai', modelId: 'gpt-4o' },
    messages: messages,
    system: "Default system prompt",
  },
  {
    sessionID: 'user-123',
    dynamicModel: true,
  }
);
```
### Logging with Names

Pass `logName` to give logged interactions a descriptive label:
```typescript
const { llm } = airbender.sdk('chat');

const response = await llm(
  {
    model: { provider: 'openai', modelId: 'gpt-4o' },
    messages: [{ role: 'user', content: 'Hello!' }],
  },
  {
    sessionID: 'user-123',
    logName: 'customer-support-chat',
  }
);
```
### Feedback Handling

Feedback is attached to a specific log entry via its `logId`, which can be captured with the `onLogId` callback in `LoggerConfig` (see Configuration Options below):

```typescript
const feedback = await airbender.logFeedback(logId, {
  rating: 5,
  comment: 'Great response!',
  id: 'feedback-123',
});
```
## Configuration Options

### AirbenderConfigProps Interface
```typescript
interface AirbenderConfigProps<T> {
  sdks: {
    [K in keyof T]: {
      llm: any;
      name: string;
      version: string;
    };
  };
  productId: string;
  modelAvailability?: {
    providers: {
      openai?: string[];
      anthropic?: string[];
      google?: string[];
    };
  };
}
```
### LoggerConfig Interface
```typescript
interface LoggerConfig {
  logInputs?: boolean;
  logOutputs?: boolean;
  additionalInfo?: any;
  onResponse?: (response: any) => void;
  onLogId?: (logId: string) => void;
  sessionID?: string;
  productKey: string;
  blockingConfig?: {
    messageOnBlock: string;
  };
  shouldValidateBeforeLogging?: boolean;
  logName?: string;
  dynamicModel?: boolean;
}
```
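The `onLogId` callback is a natural place to capture the log ID that `airbender.logFeedback` expects. A self-contained sketch (a subset of the interface is reproduced from above; the callback invocation is simulated here, whereas in practice the client calls it after writing a log):

```typescript
// A subset of LoggerConfig, reproduced so the sketch is self-contained.
interface LoggerConfig {
  logInputs?: boolean;
  logOutputs?: boolean;
  sessionID?: string;
  productKey: string;
  onLogId?: (logId: string) => void;
  logName?: string;
}

let capturedLogId: string | undefined;

const config: LoggerConfig = {
  productKey: 'your_product_key',
  sessionID: 'user-123',
  logName: 'customer-support-chat',
  // Capture the log ID so feedback can reference this interaction later.
  onLogId: (logId) => { capturedLogId = logId; },
};

// Simulating what the client does after writing a log entry:
config.onLogId?.('log-abc-123');
// capturedLogId can now be passed to airbender.logFeedback(...)
```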
## Error Handling

Blocked requests (see `blockingConfig`) surface as thrown errors, so handle them alongside other failures:
```typescript
try {
  const { llm } = airbender.sdk('chat');

  const response = await llm(
    {
      model: { provider: 'openai', modelId: 'gpt-4o' },
      messages: [{ role: 'user', content: 'Hello!' }],
    },
    {
      sessionID: 'user-123',
    }
  );
} catch (error) {
  if (error instanceof Error && error.message.includes('blocked')) {
    // Handle blocked request
  } else {
    // Handle other errors
  }
}
```
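If several routes need this handling, the string check can be factored into a small predicate. A sketch, assuming (as the example above does) that blocked requests surface as `Error`s whose message contains `'blocked'`:

```typescript
// Illustrative helper, assuming blocked requests throw Errors whose
// message contains 'blocked' (as in the example above).
function isBlockedError(error: unknown): boolean {
  return error instanceof Error && error.message.includes('blocked');
}
```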
## License

MIT License