deca-chat

v1.0.3

Published

A simple wrapper for OpenAI's Chat API that includes the ability to use custom base URLs

Downloads

278

Readme

DecaChat

A lightweight and easy-to-use wrapper for OpenAI's Chat API. DecaChat provides a simple interface for creating chat-based applications with OpenAI's GPT models.

Features

  • 🚀 Simple, intuitive API
  • 📝 TypeScript support
  • 💾 Conversation management
  • ⚙️ Configurable parameters
  • 🛡️ Built-in error handling
  • 🌐 Custom base URL support
  • 🔄 Conversation history management
  • 🤖 System message configuration
  • 📦 Zero dependencies (except OpenAI SDK)

Installation

npm install deca-chat

Quick Start

import { DecaChat } from 'deca-chat';

// Initialize the chat client
const chat = new DecaChat({
  apiKey: 'your-openai-api-key'
});

// Send a message and get a response
async function example() {
  const response = await chat.chat('Hello, how are you?');
  console.log(response);
}

Configuration

The DecaChat constructor accepts a configuration object with the following options:

interface DecaChatConfig {
  apiKey: string;      // Required: Your OpenAI API key
  model?: string;      // Optional: Default 'gpt-4o-mini'
  baseUrl?: string;    // Optional: Default 'https://api.openai.com/v1'
  maxTokens?: number;  // Optional: Default 1000
  temperature?: number; // Optional: Default 0.7
}

API Reference

Constructor

const chat = new DecaChat(config); // config is a DecaChatConfig object

Methods

setSystemMessage(message: string): void

Sets the system message for the conversation. This resets the conversation history and starts with the new system message.

chat.setSystemMessage('You are a helpful assistant specialized in JavaScript.');
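
Because setting a new system message discards earlier turns, the history afterwards contains only that system message. A minimal sketch of this behaviour, assuming an existing chat instance and the reset semantics described above:

await chat.chat('Hello');                       // adds a user turn and an assistant turn
chat.setSystemMessage('You are a terse bot.');  // resets history to just this system message
console.log(chat.getConversation());            // [{ role: 'system', content: 'You are a terse bot.' }]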

async chat(message: string): Promise<string>

Sends a message and returns the assistant's response. The message and response are automatically added to the conversation history.

const response = await chat.chat('What is a closure in JavaScript?');

clearConversation(): void

Clears the entire conversation history, including the system message.

chat.clearConversation();

getConversation(): ChatMessage[]

Returns the current conversation history, including system messages, user messages, and assistant responses.

const history = chat.getConversation();
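
The ChatMessage shape isn't spelled out in this README; judging from the conversation history output shown further below, a minimal sketch of it would be:

interface ChatMessage {
  role: 'system' | 'user' | 'assistant'; // who produced the message
  content: string;                       // the message text
}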

Example Usage

Basic Chat Application

import { DecaChat } from 'deca-chat';

async function example() {
  // Initialize with custom configuration
  const chat = new DecaChat({
    apiKey: 'your-openai-api-key',
    model: 'gpt-4',
    maxTokens: 2000,
    temperature: 0.8
  });

  // Set a system message
  chat.setSystemMessage('You are a helpful coding assistant.');

  try {
    // Start a conversation
    const response1 = await chat.chat('How do I create a React component?');
    console.log('Assistant:', response1);

    // Continue the conversation
    const response2 = await chat.chat('How do I add props to it?');
    console.log('Assistant:', response2);

    // Get conversation history
    const history = chat.getConversation();
    console.log('Conversation History:', history);
  } catch (error) {
    console.error('Error:', error);
  }
}

Custom API Endpoint

const chat = new DecaChat({
  apiKey: 'your-api-key',
  baseUrl: 'https://your-custom-endpoint.com/v1',
  model: 'gpt-4o-mini'
});

Managing Conversations

// Start with a system message
chat.setSystemMessage('You are a helpful assistant.');

// Have a conversation
await chat.chat('What is TypeScript?');
await chat.chat('How does it differ from JavaScript?');

// Get the conversation history
const history = chat.getConversation();
console.log(history);
/* Output:
[
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'What is TypeScript?' },
  { role: 'assistant', content: '...' },
  { role: 'user', content: 'How does it differ from JavaScript?' },
  { role: 'assistant', content: '...' }
]
*/

// Clear the conversation and start fresh
chat.clearConversation();

Error Handling

The chat method throws errors when:

  • The API key is invalid
  • The API request fails
  • Rate limits are exceeded
  • Network errors occur
  • An invalid model is specified
  • The custom endpoint is unreachable

Always wrap API calls in try-catch blocks for proper error handling:

try {
  const response = await chat.chat('Hello');
  console.log(response);
} catch (error) {
  console.error('Chat error:', error);
}

Best Practices

  1. System Messages: Set appropriate system messages to guide the assistant's behavior
  2. Conversation Management: Clear conversations when starting new topics
  3. Error Handling: Always implement proper error handling
  4. Resource Management: Monitor token usage and conversation length
  5. API Key Security: Never expose your API key in client-side code (see the sketch below)
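
For point 5, a minimal sketch of keeping the key out of client-side code, assuming it is available to a server-side process as an OPENAI_API_KEY environment variable (the variable name is only an example):

import { DecaChat } from 'deca-chat';

// Read the key from the environment instead of hard-coding it or shipping it to the browser
const apiKey = process.env.OPENAI_API_KEY;
if (!apiKey) {
  throw new Error('OPENAI_API_KEY is not set');
}

const chat = new DecaChat({ apiKey });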

Contributing

Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.

License

MIT License - see the LICENSE file for details.