
@j-o-r/prompt

Message prompt template for various AI models.

Table of Contents

  • Introduction
  • Features
  • Installation
  • Usage
  • API
  • License

Introduction

@j-o-r/prompt is a JavaScript library designed to create a uniform prompt/message format for AI models. It simplifies working with messages and conversations across different AI platforms and LLMs (Large Language Models). The structure is inspired by the OpenAI messages format but is not directly compatible.

Features

  • Context Management: Supports a context window for managing prompt length.
  • Multi-modal Messages: Handles text, image URLs, function requests, and function responses.
  • Token Counting: Counts the number of tokens in a string.
  • Message Export/Import: Export and import messages as JSON strings.
  • Sticky Messages: Supports sticky messages that persist across resets.
  • Record Keeping: Maintains records for billing and optimization.

Installation

To install the library, use npm:

npm install @j-o-r/prompt

Usage

Basic Example

import Prompt from '@j-o-r/prompt';

// Create a new Prompt instance
const prompt = new Prompt(1024); // contextWindow size of 1024 tokens

// Add a text message
prompt.add('user', 'Hello, how can I help you today?');

// Export messages to JSON
const exportedMessages = prompt.export();

// Import messages from JSON
prompt.import(exportedMessages);

// Get the current messages
const messages = prompt.messages;

console.log(messages);

Multi-modal Example

import Prompt from '@j-o-r/prompt';

const prompt = new Prompt(1024);

// Add a multi-modal message
const multiModalMessage = [
  { type: 'text', text: 'Here is an image for you:' },
  { type: 'image_url', image_url: 'https://example.com/image.png' }
];

prompt.addMultiModal('assistant', multiModalMessage);

console.log(prompt.messages);
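
As a small follow-up, here is a sketch that uses contentToString and countTokens (both documented in the API section below) to read the plain text back out of the multi-modal content and measure its size in tokens. The exact token count depends on the library's tokenizer.

// Continuing from the multi-modal example above
const text = prompt.contentToString(multiModalMessage);
console.log(text); // expected: 'Here is an image for you:' (only the type == 'text' entries)

// Count the number of tokens in the readable text
console.log(prompt.countTokens(text));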

API

Classes

Prompt

  • constructor(contextWindow?: number, str?: string)

    • Creates a new Prompt instance.
    • contextWindow: The maximum size of the total prompt, in tokens (default is 0 for ONE_SHOT).
    • str: A JSON-formatted string representing the initial messages.
  • get messages(): PRMessage[]

    • Returns a copy of the messages array.
  • contentToString(content: PRContent[]): string

    • Returns the readable text content (entries with type == 'text').
  • countTokens(str: string): number

    • Counts the number of tokens in a given string.
  • export(): string

    • Exports the messages as a JSON string.
  • import(str: string): void

    • Imports messages from a JSON string.
  • truncate(): boolean

    • Reduces the prompt length to fit within the contextWindow, or resets the prompt if no contextWindow is set.
  • get length(): number

    • Gets the length of the messages array.
  • get system_prompt(): string

    • Gets the system prompt from the messages.
  • get hasSystemprompt(): boolean

    • Checks if a system prompt is available.
  • add(role: string, message: string, sticky?: boolean): void

    • Adds a text message to the messages array.
  • addMultiModal(role: string, messages: PRContent[], sticky?: boolean): void

    • Adds a multi-modal message to the messages array.
  • reset(): string

    • Removes all non-sticky messages and returns them as a JSON string.
  • get records(): PRRecord[]

    • Returns a copy of the records array.
  • addRecord(record: PRRecord): void

    • Registers a record.
  • addRecords(records: PRRecord[]): void

    • Adds a list of records.
  • resetRecords(): PRRecord[]

    • Empties the prompt records and returns all records.
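
To tie several of these methods together, here is a minimal sketch covering sticky messages, truncation, reset, and record keeping. It assumes 'system' is an accepted role (the system_prompt getter suggests it is), and the record fields passed to addRecord are placeholders only, since the PRRecord shape is not documented above.

import Prompt from '@j-o-r/prompt';

const prompt = new Prompt(1024); // contextWindow of 1024 tokens

// A sticky message persists across reset()
prompt.add('system', 'You are a helpful assistant.', true);
prompt.add('user', 'Summarize our conversation so far.');

console.log(prompt.hasSystemprompt); // true
console.log(prompt.system_prompt);   // 'You are a helpful assistant.'
console.log(prompt.length);          // 2

// Reduce the prompt so it fits within the 1024-token context window
prompt.truncate();

// Remove all non-sticky messages; they are returned as a JSON string
const removed = prompt.reset();
console.log(prompt.length); // only the sticky system message remains

// Record keeping (placeholder fields, not a documented PRRecord shape)
prompt.addRecord({ model: 'example-model', tokens: prompt.countTokens(removed) });
console.log(prompt.records);
const allRecords = prompt.resetRecords(); // empties the records and returns them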

License

This project is licensed under the Apache 2.0 License.