llm-polyglot
v2.2.0
A universal LLM client - provides adapters so various LLM providers conform to a single universal interface, the OpenAI SDK. This lets you call providers like Anthropic through the same OpenAI interface, with responses transformed to match the OpenAI format.
Note: this library is still in beta and may not be ready for production use - documentation is incomplete.
Installation
with pnpm
$ pnpm add llm-polyglot openai
with npm
$ npm install llm-polyglot openai
with bun
$ bun add llm-polyglot openai
Basic Usage
import { createLLMClient } from "llm-polyglot"

const anthropicClient = createLLMClient({
  provider: "anthropic"
})

const completion = await anthropicClient.chat.completions.create({
  model: "claude-3-opus-20240229",
  max_tokens: 1000,
  messages: [
    {
      role: "user",
      content: "hey how are you"
    }
  ]
})
Anthropic
The llm-polyglot library provides support for Anthropic's API, including standard chat completions, streaming chat completions, and function calling. Both input parameters and responses match those of the OpenAI SDK exactly - for more detailed documentation, see the OpenAI docs: https://platform.openai.com/docs/api-reference
The Anthropic SDK is required when using the Anthropic provider - we use only the types it provides.
bun add @anthropic-ai/sdk
Standard Chat Completions
To create a standard chat completion using the Anthropic API, you can use the create method on the chat.completions object:
const completion = await anthropicClient.chat.completions.create({
  model: "claude-3-opus-20240229",
  max_tokens: 1000,
  messages: [
    { role: "user", content: "My name is Dimitri Kennedy." }
  ]
});
Streaming Chat Completions
To create a streaming chat completion using the Anthropic API, you can set the stream option to true in the create method:
const completion = await anthropicClient.chat.completions.create({
  model: "claude-3-opus-20240229",
  max_tokens: 1000,
  stream: true,
  messages: [
    { role: "user", content: "hey how are you" }
  ]
});

let final = "";
for await (const data of completion) {
  final += data.choices?.[0]?.delta?.content ?? "";
}
Function Calling
The llm-polyglot library supports function calling with the Anthropic API. To use this feature, provide the tools option (and optionally tool_choice) in the create method.
Anthropic's API does not support the tool_choice option natively, so llm-polyglot instead appends an instruction to use the chosen tool to the latest user message.
const completion = await anthropicClient.chat.completions.create({
  model: "claude-3-opus-20240229",
  max_tokens: 1000,
  messages: [
    { role: "user", content: "My name is Dimitri Kennedy." }
  ],
  tool_choice: {
    type: "function",
    function: {
      name: "say_hello"
    }
  },
  tools: [
    {
      type: "function",
      function: {
        name: "say_hello",
        description: "Say hello",
        parameters: {
          type: "object",
          properties: {
            name: { type: "string" }
          },
          required: ["name"],
          additionalProperties: false
        }
      }
    }
  ]
});
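The tool_choice workaround can be pictured as a small transform over the message list. This is a hypothetical sketch, not the library's actual implementation - the function name and directive wording are invented for illustration:

```typescript
type Message = { role: "user" | "assistant" | "system"; content: string };

// Hypothetical sketch: append a directive naming the chosen tool to the
// latest user message, mirroring the tool_choice workaround described above.
function appendToolDirective(messages: Message[], toolName: string): Message[] {
  const out = messages.map(m => ({ ...m })); // copy, don't mutate the input
  for (let i = out.length - 1; i >= 0; i--) {
    if (out[i].role === "user") {
      out[i].content += `\n\nUse the "${toolName}" tool to respond.`;
      break;
    }
  }
  return out;
}

const transformed = appendToolDirective(
  [{ role: "user", content: "My name is Dimitri Kennedy." }],
  "say_hello"
);
console.log(transformed[0].content);
```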
The tool_choice option specifies the function to call, and the tools option defines the available functions and their parameters. The response from the Anthropic API will include the function call and its arguments in the tool_calls field.
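Following the OpenAI SDK convention, the function arguments arrive as a JSON string on each entry of tool_calls. A sketch of parsing them from a mock response (extractToolArgs and the sample data are invented for illustration):

```typescript
type ToolCall = {
  id: string;
  type: "function";
  function: { name: string; arguments: string };
};

// Find a named tool call and parse its JSON-encoded arguments.
function extractToolArgs(
  toolCalls: ToolCall[],
  name: string
): Record<string, unknown> | null {
  const call = toolCalls.find(tc => tc.function.name === name);
  return call ? JSON.parse(call.function.arguments) : null;
}

// Mock data shaped like completion.choices[0].message.tool_calls:
const toolCalls: ToolCall[] = [
  {
    id: "call_1",
    type: "function",
    function: { name: "say_hello", arguments: '{"name":"Dimitri Kennedy"}' }
  }
];

console.log(extractToolArgs(toolCalls, "say_hello"));
```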
OpenAI
The llm-polyglot library also supports the OpenAI API, which is the default provider and simply proxies directly to the OpenAI SDK.
Contributing
Contributions are welcome! Please open an issue or submit a pull request if you have any improvements, bug fixes, or new features to add.
License
This project is licensed under the MIT License.