
llm: language model usage made easy

Manipulate any language model from the command line.

From simple to advanced usage.

$ llm "Hello world."
Hello there! How can I assist you today?

Leave a ⭐ star to support the project.

Models

Some models are still being added. This is a work in progress.

| Model Name                | Status | Description                            |
|---------------------------|--------|----------------------------------------|
| EVERY OpenAI model        | ✅     |                                        |
| gpt-3.5-turbo             | ✅     | ChatGPT                                |
| gpt-4                     | ✅     | GPT-4 via API (waitlist)               |
| text-davinci-003          | ✅     | InstructGPT (GPT-3)                    |
| llama2                    | ✅     | Meta's Llama 2                         |
| bing-chat                 | ✅     | Bing Chat: creative, balanced, precise |
| bert                      | ✅     | BERT by Google                         |
| llama-7b-hf               | ✅     | Famous llama model                     |
| wizardlm-13b-uncensored   | ✅     | WizardLM 30B                           |
| guanaco-65b-gptq          | ✅     | Guanaco 65B                            |
| bard                      | 🔄     | Google Bard                            |
| ... HuggingFace 🤗 models | ✅     | every text-generation model            |

Other models can be installed using the --install command.
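For illustration, reusing the example descriptor URL that appears later in this README:

$ llm --install github.com/snwfdhmp/llm-descriptor-llama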

Features

| Feature           | Status | Comment                                                                     |
|-------------------|--------|-----------------------------------------------------------------------------|
| Prompt            | ✅     | Prompt model with default parameters                                        |
| Parameterization  | ✅     | temperature, max-length, top-p, top-k, ...                                  |
| ChatGPT Plugins   | 🔄     | Use ChatGPT plugins; web-pilot working, global plugin system in development |
| Use files         | ✅     | Query models using prompt files                                             |
| Prompt chaining   | ✅     | Call prompts like functions                                                 |
| Prompt templating | 🔄     | Use variables in prompt files                                               |

Getting started

git clone https://github.com/snwfdhmp/llm && cd llm
yarn install

Make an llm alias:

alias llm="node $(pwd)/main.js"

Add it to your .bashrc or .zshrc to make it permanent.
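For example, assuming the repository was cloned to ~/llm and you use zsh (the path and rc file are assumptions; adjust to your setup):

# append the alias to your shell configuration and reload it
echo 'alias llm="node ~/llm/main.js"' >> ~/.zshrc
source ~/.zshrc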

You're ready to go! Try:

$ llm "Hello world"
$ llm -m bing-creative "Tell me a joke"
$ llm -m gpt-3.5-turbo "Tell me a joke"

Usage

Simple prompting with default parameters

$ llm "what is the meaning of life?"

Use a specific model

$ llm -m bing-creative "find project ideas to learn react"

Use custom parameters

$ llm --max-length 512 --temperature 1 --top-p 0.9 --top-k 60 "follow the instructions."
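These flags can also be combined with model selection; combining them in a single call is assumed to work as with any standard CLI:

$ llm -m gpt-3.5-turbo --temperature 0.2 --max-length 256 "Summarize this text in one sentence."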

List available models

$ llm ls
Name                LastUsedAt     Author      Description
text-davinci-003    2021-10-10     OpenAI      InstructGPT by OpenAI
gpt-3.5-turbo       2021-10-10     OpenAI      ChatGPT by OpenAI
gpt-4-web           2021-10-10     OpenAI      GPT-4 by OpenAI via chatGPT
llama               2021-10-10     Meta        Meta's Llama
bard                2021-10-10     Google      Google Bard
...

Use files as prompts

$ llm -f ./prompt.txt
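For illustration, a prompt file is assumed to be plain text; the file name and contents below are hypothetical:

$ cat ./prompt.txt
Summarize the following notes in three bullet points:
- llm wraps many language models behind one command-line interface
- models are described by YAML model descriptors
- prompts accept parameters such as temperature, max-length, top-p and top-k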

Incoming:

  • Conversation system (remember past messages)
  • Install 3rd party models
  • Chaining

$ llm -s session_name "what is the meaning of life?"       # remembers past messages
$ llm --install github.com/snwfdhmp/llm-descriptor-llama   # downloads model from github

Add any model

Any model can be plugged into llm using a model descriptor.

Example of a model descriptor that requires installation:

kind: llm/descriptor/v1
metadata:
    name: llama
model:
    install: |
        git clone ...
        cd ...
        ./install.sh
        # or
        docker pull ...
        # or
        none
    usage:
        ./model-executor -f model.bin $LLM_PARAM_PROMPT
    parameters:
        LLM_PARAM_PROMPT:
            type: string
            description: The prompt to use
            default: "Hello world"
        LLM_PARAM_MAX_TOKENS:
            type: int
            description: The maximum length of context
            default: 100
        LLM_PARAM_TEMPERATURE:
            type: float
            description: The temperature of the model
            default: 0.7
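As a rough sketch of what this descriptor implies at run time (an assumption based on the LLM_PARAM_* references above, not documented behavior), llm presumably exports each parameter as an environment variable and then runs the usage command, roughly like:

# assumed expansion of: llm -m llama --temperature 0.7 "Hello world"
LLM_PARAM_PROMPT="Hello world" \
LLM_PARAM_MAX_TOKENS=100 \
LLM_PARAM_TEMPERATURE=0.7 \
sh -c './model-executor -f model.bin "$LLM_PARAM_PROMPT"'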

Example of a model descriptor that uses an API:

kind: llm/descriptor/v1
metadata:
    name: llama
model:
    install: |
        read -p "Enter your API key:" LLM_API_KEY
        echo "LLM_API_KEY=$LLM_API_KEY" >> ~/.bashrc
    usage: curl -s $LLM_PARAM_API_TARGET_URL -d "prompt=$LLM_PARAM_PROMPT&api_key=$LLM_API_KEY"
    parameters:
        LLM_PARAM_API_TARGET_URL:
            type: string
            description: The URL of the API
            default: "https://api.llm.com"
        LLM_PARAM_PROMPT:
            type: string
            description: The prompt to use
            default: "Hello world"
        LLM_PARAM_MAX_TOKENS:
            type: int
            description: The maximum length of context
            default: 100
        LLM_PARAM_TEMPERATURE:
            type: float
            description: The temperature of the model
            default: 0.7

Env variables

These variables can be used to tweak llm behavior.

  • LLM_DEFAULT_MODEL - The default model to use when no model is specified
  • LLM_ENABLED_PLUGINS - A comma-separated list of plugins to enable
  • OPENAI_ORGANIZATION_ID - The organization ID to use for OpenAI models
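For example, in a shell profile (values are illustrative; use model and plugin names available in your install):

export LLM_DEFAULT_MODEL=gpt-3.5-turbo
export LLM_ENABLED_PLUGINS=web-pilot
export OPENAI_ORGANIZATION_ID=org-xxxxxxxx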

Roadmap

Project vision and further information can be found in the docs.

Contributing

Contribute easily by leaving a ⭐ star on the project.

Code contributions are welcome. Please open an issue or a pull request.

Join the team at discord.gg/ccDghPeAT9.