# LocalLm API

An API to query local language models using different backends. Several backends are supported; the example below uses Koboldcpp.
:books: API doc
## Install

```bash
npm install @locallm/api
```
## Usage

Example with the Koboldcpp provider:
```ts
import { Lm } from "@locallm/api";

// create a client bound to a local Koboldcpp server,
// streaming each generated token to stdout
const lm = new Lm({
  providerType: "koboldcpp",
  serverUrl: "http://localhost:5001",
  onToken: (t) => process.stdout.write(t),
});

// fill the instruction template with the user prompt
const template = "<s>[INST] {prompt} [/INST]";
const _prompt = template.replace("{prompt}", "list the planets in the solar system");

// run the inference query
await lm.infer(_prompt, {
  stream: true,
  temperature: 0.2,
  n_predict: 200,
});
```
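The template substitution step above can be factored into a small helper. Note that `buildPrompt` is a hypothetical convenience function for this sketch, not part of `@locallm/api`:

```typescript
// Hypothetical helper: fill every "{prompt}" placeholder in a template.
// Illustrative only; not provided by @locallm/api.
function buildPrompt(template: string, prompt: string): string {
  // split/join replaces all occurrences, unlike String.replace with a string
  return template.split("{prompt}").join(prompt);
}

const instTemplate = "<s>[INST] {prompt} [/INST]";
console.log(buildPrompt(instTemplate, "list the planets in the solar system"));
// → <s>[INST] list the planets in the solar system [/INST]
```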
Check the examples directory for more use cases.