
unified-llm-api

v1.0.11


Unified LLM API

"One Class to rule them all"

Unified LLM API is a simple way to integrate an LLM into your Node.js project.

Usage

Install

Install package in your project:

npm i unified-llm-api

Import

import { LLM } from "unified-llm-api";

Initialize

const model = new LLM({
  model: "LLM model name",
  APIkey: "Your API key for specified LLM",
});

Available models:

  • Google:
    • "gemini-1.5-pro-latest"
  • OpenAI:
    • "gpt-3.5-turbo"
    • "gpt-3.5-turbo-0125"
    • "gpt-3.5-turbo-16k"
    • "gpt-4"
    • "gpt-4-turbo"

Make a request

Example:

const main = async () => {
  const response = await model.generateContent({
    prompt: "What's the meaning of life?",
  });
  console.log(response);
};

main();

Available methods

  1. generateContent({prompt: string, systemMessage?: string}) => Promise<string | null>

    Retrieves a single response from the model based on the provided prompt.

    Parameters:

    • prompt — The user's input prompt for the model.
    • systemMessage (optional) — Optional text instructions guiding the behavior of the language model.

    Returns the model's response to the given prompt.

  2. chat({prompt: string, systemMessage?: string, history?: GeneralMessage[], onHistoryChange?: (history: GeneralMessage[]) => void}) => Promise<string | null>

    Initiates a chat conversation with the language model. The library maintains the conversation context and tracks the chat history.

    Parameters:

    • prompt — The user's input prompt for the model.
    • systemMessage (optional) — Optional text instructions guiding the behavior of the language model.
    • history (optional) — Optional chat history array, letting you take full control of the history yourself.
    • onHistoryChange (optional) — Optional callback invoked after each change to the internal chat history, e.g. for caching or persistence.

    Returns the model's response to the provided chat prompt.
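To illustrate the history contract without network calls or an API key, here is a minimal stand-in (not the real LLM class — the model reply is faked) that mimics how chat() appends the user prompt and the model reply, and fires onHistoryChange after each change:

```javascript
// Minimal stand-in sketch of the chat()/history contract described above.
// This is NOT the real LLM class: the reply is faked ("echo: ...") so the
// history bookkeeping can be shown without an API key.
class FakeChat {
  constructor() {
    this.messages = []; // internal history: Array<{role, content}>
  }

  async chat({ prompt, history, onHistoryChange }) {
    // A caller-supplied history array replaces the internal one.
    if (history) this.messages = history;

    this.messages.push({ role: "user", content: prompt });
    const reply = `echo: ${prompt}`; // the real class would call the model here
    this.messages.push({ role: "assistant", content: reply });

    // Fired after each history change, e.g. to persist it on your side.
    if (onHistoryChange) onHistoryChange(this.messages);
    return reply;
  }
}

(async () => {
  const model = new FakeChat();
  const lengths = [];
  await model.chat({
    prompt: "Hello",
    onHistoryChange: (h) => lengths.push(h.length),
  });
  await model.chat({
    prompt: "Again",
    onHistoryChange: (h) => lengths.push(h.length),
  });
  console.log(lengths); // grows by two messages (user + assistant) per turn
})();
```

With the real class, the same call shape applies — only the reply comes from the configured model instead of being echoed back.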

Chat history

To simplify chat interactions, Unified LLM API handles the chat history for you, so you don't have to manage it yourself. A few methods for working with the history are still exposed, since at the very least you'll want to clear it on your own :)

Available methods

  1. get();

    Retrieves the chat history as an array of messages:

    const chatHistory = model.history.get();

    Return type:

     GeneralMessage[], i.e. Array<{role: string, content: string}>
  2. clear();

    Clears the chat history array:

    model.history.clear();

Nota bene

The internal chat history is not protected against server crashes or restarts. If you need it to survive, you have two options: pass your own chat history array to chat() via the optional history parameter, or pass a callback via onHistoryChange to act on the internal history after each change — for example, to cache it or save it on your side.

Enjoy!