[![npm version](https://badge.fury.io/js/@andrao%2Fllm-client.svg)](https://badge.fury.io/js/@andrao/llm-client) ![build](https://github.com/andrao/llm-client/workflows/CI/badge.svg)
# 🤖 @andrao/llm-client
This package provides a single interface for interacting with LLMs from Anthropic, OpenAI, and Together.ai, as well as local models via Ollama.
## Primary exports
| Function            | Description                            |
| ------------------- | -------------------------------------- |
| `runChatCompletion` | Interoperable chat completion function |
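
A minimal usage sketch of `runChatCompletion` is below. The option shape (`provider`, `model`, `messages`) is an assumption made for illustration, not the package's documented signature; check the exported types for the actual contract.

```ts
// Hypothetical sketch only: the argument shape is assumed for illustration
// and may not match the package's actual types.
import { runChatCompletion } from '@andrao/llm-client';

async function main() {
    const response = await runChatCompletion({
        provider: 'anthropic', // assumed discriminator selecting the backend
        model: 'claude-3-5-sonnet-latest',
        messages: [{ role: 'user', content: 'Say hello in five words.' }],
    });

    console.log(response);
}

main().catch(console.error);
```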
## Secondary exports
| Function             | Description                                       |
| -------------------- | ------------------------------------------------- |
| `getAnthropicClient` | Lazy-init an Anthropic SDK client                 |
| `getOllamaClient`    | Lazy-init an Ollama client via the OpenAI SDK     |
| `getOpenAIClient`    | Lazy-init an OpenAI SDK client                    |
| `getTogetherClient`  | Lazy-init a Together.ai client via the OpenAI SDK |
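
A sketch of using a getter directly is below. The getter names are the real exports listed above; the zero-argument call and environment-based configuration are assumptions. Because `getOpenAIClient` returns an OpenAI SDK client, the standard `chat.completions.create` call applies to whatever it returns.

```ts
// Sketch under assumptions: a zero-argument getter that constructs the SDK
// client on first call (e.g. from environment credentials) and returns the
// cached instance thereafter.
import { getOpenAIClient } from '@andrao/llm-client';

async function main() {
    const openai = getOpenAIClient(); // assumed to lazily construct and cache the client

    // Standard OpenAI SDK call, valid for any OpenAI client the getter returns.
    const completion = await openai.chat.completions.create({
        model: 'gpt-4o-mini',
        messages: [{ role: 'user', content: 'Hello!' }],
    });

    console.log(completion.choices[0]?.message.content);
}

main().catch(console.error);
```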