
@doc-gpt/chat v1.1.7

Typed client for interacting with the OpenAI Chat Api


Doc GPT - Chat SDK

Typed SDK for the OpenAI Chat Api

Use the new OpenAI Chat Api /v1/chat/completions endpoint with ease.

Streaming requests are supported.

Note: The OpenAI Chat Api is still in beta and may change quickly. If you notice api changes that have not been implemented here yet, please open an issue or propose a PR.

For more information about the underlying api, please refer to the OpenAI Chat Api docs.

Install

The library is available on both NPM and GitHub Packages.

Install from NPM

npm i @doc-gpt/chat

Install from GitHub Packages

First, add a .npmrc file in the root of your project with the following content:

@doc-gpt:registry=https://npm.pkg.github.com
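
Note that GitHub Packages generally requires authentication even for public packages, so you will likely also need an auth token line in the same .npmrc (this assumes a personal access token exposed through a GITHUB_TOKEN environment variable):

//npm.pkg.github.com/:_authToken=${GITHUB_TOKEN}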

Then install the package

npm install @doc-gpt/chat@1.1.7

Getting started

DocGptChat offers a simple api to interact with the OpenAI Chat Api. You just need to configure an instance and start chatting with the available methods:

  • Chat(messages, options)
  • SimpleChat(messages, options)
  • ChatStream(messages, options, onMessageDelta, onDone, onError)

Note: If you use a static api key, it's better to use an environment variable.

Import

import DocGptChat from '@doc-gpt/chat';

Configure the instance

const gpt = new DocGptChat({
  // Required
  apiKey: OPENAI_API_KEY, // If static, it's highly recommended to load it from an environment variable
  // Optional
  defaultModel: GptModels['gpt-3.5-turbo-0301'], // or simply a string (default: 'gpt-3.5-turbo')
  // Optional
  defaultSystemMessage: 'You are ChatGPT...',
});

Simple usage example:

If you just need the first message content from the response, you can use the SimpleChat method, which wraps Chat with the same parameters and returns the first message string.

try {
  // Get a full response from the api
  const message = await gpt.SimpleChat([
    {
      role: 'user',
      content: prompt,
    },
  ]);

  // Use the message
} catch (err) {
  // Handle Api errors
}

Basic usage example:

If you need the full response from the api, you can use the Chat method.

try {
  // Get a full response from the api
  const res = await gpt.Chat([
    {
      role: 'user',
      content: prompt,
    },
  ]);

  // Handle the response

  // For example:
  // Get the first message object
  const firstMessage = res.choices[0].message;
} catch (err) {
  // Handle Api errors
}

Stream usage example:

If you want a streamed response, you can use the ChatStream method.

// Message to increment
let message = '';

// Stream the response from the api
gpt
  .ChatStream([
    {
      role: 'user',
      content: prompt,
    },
  ])
  .onMessage((messageDelta, fullResponse) => {
    // Increment the response message
    message = message + messageDelta;
  })
  .onDone(() => {
    // Use the full message
    console.log(message);
  })
  .onError((err) => {
    // Handle errors
  });

Methods

Chat

  // Get a full response
  public async Chat(
    messages: GptMessage[],
    options?: GptChatOptions
  ): Promise<GptResponse>

SimpleChat

  // Get the first choice message content.
  public async SimpleChat(
    messages: GptMessage[],
    options?: GptChatOptions
  ): Promise<string>

Types

All the available message roles:

// Message Roles
type GptRole = 'system' | 'user' | 'assistant';

Chat Message

// Chat Message
interface GptMessage {
  content: string;
  role?: GptRole;
}
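
For example, a conversation can be expressed as an array of GptMessage values combining the roles above; a minimal sketch using only the types and methods documented in this readme:

// A system message followed by a user message
const messages: GptMessage[] = [
  { role: 'system', content: 'You are ChatGPT...' },
  { role: 'user', content: 'Summarize this text for me.' },
];

const res = await gpt.Chat(messages);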

Response

// OpenAI `/v1/chat/completions` api response
interface GptResponse {
  id: string;
  object: string;
  created: number;
  choices: GptResponseChoices[];
  usage: GetResponceUsage;
}

// Response Choice
interface GptResponseChoices {
  message: GptMessage;
  index: number;
  finish_reason: string;
}

// Response Choice structure with stream mode
export interface GptResponseStreamChoices {
  delta: GptMessage; // delta instead of message
  index: number;
  finish_reason: string;
}

// Response Usage
interface GetResponceUsage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}
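
As a sketch of how these types fit together, the response returned by Chat can be unpacked like this (assuming prompt is defined as in the earlier examples):

// Read the first choice and the token usage from a GptResponse
const res: GptResponse = await gpt.Chat([{ role: 'user', content: prompt }]);

const firstMessage: GptMessage = res.choices[0].message;
const totalTokens: number = res.usage.total_tokens;

console.log(firstMessage.content, totalTokens);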

Utils

There is a const named GptModels you can import in order to refer to the available models. If a new model is released and you want to try it, you can still pass any string to the model option.

export const GptModels = {
  'gpt-3.5-turbo': 'gpt-3.5-turbo' as GptModel,
  'gpt-3.5-turbo-0301': 'gpt-3.5-turbo-0301' as GptModel,
};

// Example
const model = GptModels['gpt-3.5-turbo-0301'];
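
The wording above suggests that GptChatOptions accepts a model option; assuming that, the default model can be overridden per request, as in this sketch:

// Override the default model for a single request
// (assumes GptChatOptions exposes a `model` option, as the text above implies)
const res = await gpt.Chat(
  [{ role: 'user', content: prompt }],
  { model: GptModels['gpt-3.5-turbo-0301'] }
);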

Dependencies

Axios

This library depends on the axios library.

  • See the axios-http website
  • See axios on NPM
  • See the axios/axios repository

TextLineStream class

This library includes a copy of the TextLineStream class from the denoland/deno_std (MIT) project.

  • See the Deno website
  • See the denoland/deno_std repository


Take a look at the file THIRD_PARTY_LICENCES for the full licences.

Notes

This project is created and maintained by @doc-packages.
If you find a bug or have a suggestion, an Issue is very welcome!

Copyright (c) 2023-present, Francesco Bellini