llm-polyglot

This library is still in beta and may not be ready for production use; documentation is incomplete.
Installation

with pnpm

$ pnpm add llm-polyglot openai

with npm

$ npm install llm-polyglot openai

with bun

$ bun add llm-polyglot openai

Basic Usage

import { createLLMClient } from "llm-polyglot";

const anthropicClient = createLLMClient({
  provider: "anthropic"
});

const completion = await anthropicClient.chat.completions.create({
  model: "claude-3-opus-20240229",
  max_tokens: 1000,
  messages: [
    {
      role: "user",
      content: "hey how are you"
    }
  ]
});
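
Since responses are transformed to match the OpenAI SDK shape, the reply can be read the same way as with the OpenAI client. A minimal sketch, assuming the completion above succeeded:

// The assistant's reply lives on choices[0].message.content,
// just as in an OpenAI SDK response.
console.log(completion.choices[0]?.message?.content);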

Anthropic

The llm-polyglot library provides support for Anthropic's API, including standard chat completions, streaming chat completions, and function calling. Both input parameters and responses match those of the OpenAI SDK exactly; for more detailed documentation, see the OpenAI docs: https://platform.openai.com/docs/api-reference

The Anthropic SDK is required when using the anthropic provider; only the types provided by the SDK are used.

  bun add @anthropic-ai/sdk

Standard Chat Completions

To create a standard chat completion using the Anthropic API, you can use the create method on the chat.completions object:

const completion = await anthropicClient.chat.completions.create({
  model: "claude-3-opus-20240229",
  max_tokens: 1000,
  messages: [
    { role: "user", content: "My name is Dimitri Kennedy." }
  ]
});

Streaming Chat Completions

To create a streaming chat completion using the Anthropic API, you can set the stream option to true in the create method:

const completion = await anthropicClient.chat.completions.create({
  model: "claude-3-opus-20240229",
  max_tokens: 1000,
  stream: true,
  messages: [
    { role: "user", content: "hey how are you" }
  ]
});

let final = "";
for await (const data of completion) {
  final += data.choices?.[0]?.delta?.content ?? "";
}

Function Calling

The llm-polyglot library supports function calling for the Anthropic API. To use this feature, provide the tool_choice (optional) and tools options in the create method.

Anthropic does not support the tool_choice option, so llm-polyglot instead appends an instruction to use the specified tool to the latest user message.

const completion = await anthropicClient.chat.completions.create({
  model: "claude-3-opus-20240229",
  max_tokens: 1000,
  messages: [
    { role: "user", content: "My name is Dimitri Kennedy." }
  ],
  tool_choice: {
    type: "function",
    function: {
      name: "say_hello"
    }
  },
  tools: [
    {
      type: "function",
      function: {
        name: "say_hello",
        description: "Say hello",
        parameters: {
          type: "object",
          properties: {
            name: { type: "string" }
          },
          required: ["name"],
          additionalProperties: false
        }
      }
    }
  ]
});

The tool_choice option specifies the function to call, and the tools option defines the available functions and their parameters. The response from the Anthropic API will include the function call and its arguments in the tool_calls field.
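
Because the response is normalized to the OpenAI shape, the function call can be read from the tool_calls array on the assistant message. A minimal sketch, assuming the completion from the example above:

// tool_calls follows the OpenAI response shape; arguments arrive as a JSON string.
const toolCall = completion.choices[0]?.message?.tool_calls?.[0];

if (toolCall?.type === "function") {
  const args = JSON.parse(toolCall.function.arguments);
  console.log(toolCall.function.name, args); // e.g. say_hello { name: "Dimitri Kennedy" }
}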

OpenAI

The llm-polyglot library also provides support for the OpenAI API, which is the default provider and proxies directly to the OpenAI SDK.
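
A minimal sketch of using the OpenAI provider; the provider: "openai" value and the model name are assumptions by analogy with the anthropic example above, and the client otherwise behaves like the OpenAI SDK (reading OPENAI_API_KEY from the environment):

import { createLLMClient } from "llm-polyglot";

// provider: "openai" is assumed here by analogy with the anthropic example;
// the resulting client proxies directly to the OpenAI SDK.
const openaiClient = createLLMClient({ provider: "openai" });

const completion = await openaiClient.chat.completions.create({
  model: "gpt-4-turbo", // any OpenAI chat model
  max_tokens: 1000,
  messages: [{ role: "user", content: "hey how are you" }]
});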

Contributing

Contributions are welcome! Please open an issue or submit a pull request if you have any improvements, bug fixes, or new features to add.

License

This project is licensed under the MIT License.