# Function LLM for JavaScript
https://github.com/user-attachments/assets/86ae6012-264e-437f-9ab8-94408f4105ba
Use local LLMs in your browser and Node.js apps. This package is designed to patch OpenAI and Anthropic clients to run inference locally, using predictors hosted on Function.
> [!TIP]
> We offer a similar package for use in Python. Check out fxn-llm.
> [!IMPORTANT]
> This package is still a work in progress, so the API could change drastically between releases.
> [!CAUTION]
> Never embed access keys client-side (i.e. in the browser). Instead, create a proxy URL in your backend.
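For browser deployments, one common pattern is a thin server-side proxy that injects the access key so it never reaches the client. Below is a minimal sketch using Express and Node 18+; the route name and upstream URL are placeholders for illustration, not part of fxn-llm's API:

```js
import express from "express";

const app = express();
app.use(express.json());

// Browser code calls this route instead of embedding the access key.
app.post("/api/predict", async (req, res) => {
  // Hypothetical upstream endpoint — point this at whatever your deployment actually calls.
  const response = await fetch("https://api.fxn.ai/predict", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // The access key stays in the server environment and is never shipped to the browser.
      Authorization: `Bearer ${process.env.FXN_ACCESS_KEY}`
    },
    body: JSON.stringify(req.body)
  });
  res.status(response.status).json(await response.json());
});

app.listen(3000);
```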
## Installing Function LLM
Function LLM is distributed on NPM. Open a terminal and run the following command:
```bash
# Run this in Terminal
$ npm install fxn-llm
```
> [!IMPORTANT]
> Make sure to create an access key by signing in to Function. You'll need it to fetch the predictor at runtime.
## Using the OpenAI Client Locally
To run text generation and embedding models locally using the OpenAI client, patch your OpenAI instance with the `locally` function:
```js
import { locally } from "fxn-llm"
import { OpenAI } from "openai"

// 💥 Create your OpenAI client
let openai = new OpenAI({ apiKey: "fxn", dangerouslyAllowBrowser: true });

// 🔥 Make it local
openai = locally(openai, {
  accessKey: process.env.NEXT_PUBLIC_FXN_ACCESS_KEY
});

// 🚀 Generate embeddings
const embeddings = await openai.embeddings.create({
  model: "@nomic/nomic-embed-text-v1.5-quant",
  input: "search_query: Hello world!"
});
```
> [!WARNING]
> Currently, only `openai.embeddings.create` is supported. Text generation is coming soon!
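Downstream code that consumes the result can stay generic. Here is a minimal sketch of comparing two embeddings by cosine similarity; it assumes the patched client preserves the standard OpenAI response shape (`data[0].embedding`), and the `embed` helper is illustrative rather than part of fxn-llm:

```js
// Cosine similarity between two embedding vectors.
function cosineSimilarity (a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Hypothetical helper wrapping the patched client from the example above.
const embed = async text => {
  const res = await openai.embeddings.create({
    model: "@nomic/nomic-embed-text-v1.5-quant",
    input: text
  });
  return res.data[0].embedding;
};

const a = await embed("search_query: Hello world!");
const b = await embed("search_query: Hi there!");
console.log(cosineSimilarity(a, b)); // closer to 1.0 means more similar
```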
## Useful Links
- Discover predictors to use in your apps.
- Join our Discord community.
- Check out our docs.
- Learn more about us on our blog.
- Reach out to us at [email protected].
Function is a product of NatML Inc.