llm-exe
v2.0.0-beta.12
Simplify building LLM-powered apps with easy-to-use base components, supporting text and chat-based prompts with handlebars template engine, output parsers, and flexible function calling capabilities.
A package that provides simplified base components to make building and maintaining LLM-powered applications easier.
- Write functions powered by LLMs with easy-to-use building blocks.
- Pure JavaScript and TypeScript. Allows you to pass and infer types.
- Support for text-based (llama-3) and chat-based prompts (gpt-4o, claude-3.5).
- Supercharge your prompts by using handlebars within your prompt templates.
- Allow LLMs to call functions (or call other LLM executors).
- Not very opinionated. You have control over how you use it.
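To illustrate the handlebars feature above, here is a minimal, self-contained sketch of what `{{placeholder}}` substitution does to a prompt template at execution time. Note that `renderTemplate` is a hypothetical stand-in written for this example, not llm-exe's API; the library performs this kind of replacement internally when you call `execute()` with input values.

```typescript
// Hypothetical stand-in for handlebars-style substitution in a prompt
// template: replaces each {{key}} with the matching value, or "" if absent.
function renderTemplate(
  template: string,
  values: Record<string, string>
): string {
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (_match, key: string) =>
    key in values ? values[key] : ""
  );
}

const systemTemplate =
  "You are a scheduling assistant for {{company}}. Today is {{date}}.";

console.log(renderTemplate(systemTemplate, { company: "Acme", date: "2024-06-01" }));
// → "You are a scheduling assistant for Acme. Today is 2024-06-01."
```

Templating the system prompt this way keeps instructions static while per-request values flow in through the executor's input.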
See full docs here
Install
Install llm-exe using npm.
npm i llm-exe
import llmExe from "llm-exe";
// or
import { /* specific modules */ } from "llm-exe";
Basic Example
Below is a simple example:
import {
  useLlm,
  createChatPrompt,
  createParser,
  createLlmExecutor
} from "llm-exe";
const instruction = `<some prompt>
Your response must be formatted like:
<subtask>
<subtask>
<subtask>`;
const llm = useLlm("openai.chat.v1", { /* options */ });
const prompt = createChatPrompt(instruction).addUserMessage();
const parser = createParser("listToArray");
const executor = createLlmExecutor({
  llm,
  prompt,
  parser
});
const input = "Hello! When was my last appointment?";
const response = await executor.execute({ input });
/**
*
* The prompt sent to the LLM would be:
* (line breaks added for readability)
*
* [{
* role: 'system',
* content: '<some prompt>\n
* Your response must be formatted like:\n<subtask>\n<subtask>\n
* <subtask>'
* },
* {
* role: 'user',
* content: 'Hello! When was my last appointment?'
* }]
*
*/
/**
*
* console.log(response)
* [
* "a subtask the llm generated",
* "a subtask the llm generated",
* "a subtask the llm generated",
* ]
 */
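The `listToArray` parser named above is what turns the LLM's raw text response into the array shown. As a rough mental model, here is a self-contained sketch of what a list-style parser conceptually does: split a newline-delimited response into trimmed, non-empty items. This `listToArray` function is an illustration written for this README, and llm-exe's actual parser may differ in detail.

```typescript
// Conceptual sketch of a "listToArray"-style output parser:
// split on newlines, strip leading bullet/number markers, drop blanks.
// (Illustrative only; not llm-exe's real implementation.)
function listToArray(raw: string): string[] {
  return raw
    .split("\n")
    .map((line) => line.replace(/^[-*\d.)\s]+/, "").trim())
    .filter((line) => line.length > 0);
}

const rawResponse = "- check calendar\n- find last appointment\n- report the date";
console.log(listToArray(rawResponse));
// → ["check calendar", "find last appointment", "report the date"]
```

Pairing a parser with the executor like this is what lets the rest of your code treat the LLM call as an ordinary typed function that returns `string[]`.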