
@e-llm-studio/streaming-response

v1.0.8 • Published

stream messages from eLLM

Downloads: 1,084


eLLM Studio Chat Stream Package

Welcome to the eLLM Studio Chat Stream package! 🎉 This package enables streaming chat functionality with your AI assistant in eLLM Studio via WebSocket and GraphQL. It's designed for both frontend and backend implementations.

🚀 Features

  • Real-time Streaming: Receive messages in a streaming fashion.
  • AI Assistant Integration: Seamlessly connects with the AI models deployed within your organization.
  • Customizable: Pass custom prompts or previous messages to enhance conversations.
  • Error Handling: Catch and manage errors using callbacks.
  • Image Upload: Upload images to the server.
  • Image Deletion: Delete previously uploaded images from the server.

📦 Installation


npm i @e-llm-studio/streaming-response

🛠️ Streaming Usage

Here’s how you can use the startChatStream function to set up the AI chat stream:

import { startChatStream } from '@e-llm-studio/streaming-response';
import { v4 as uuid } from 'uuid'; // npm i uuid

const params = {
    WEBSOCKET_URL: 'wss://your-ellm-deployment/graphql', // Required: WebSocket URL for your deployment
    organizationName: 'TechOrg',                         // Required: Organization name where the assistant is created
    chatAssistantName: 'MyAssistant',                    // Required: Assistant name
    selectedAIModel: 'gpt-4',                            // Required: AI model selected for the assistant
    replyMessage: '',                                    // Optional: Previous response from the AI
    userName: 'John Doe',                                // Required: Username of the person using the assistant
    userEmailId: 'john.doe@example.com',                 // Required: User's email
    userId: 'user-123',                                  // Required: Unique identifier for the user
    query: 'What is the weather like today?',            // Required: The user's question or prompt
    requestId: `requestId-${uuid()}`,                    // Required: Unique ID for the request
    customPrompt: 'Personalized prompt here',            // Optional: Add custom context to your prompt
    enableForBackend: false,                             // Optional: Set to true when running on the backend (Node.js)
    images: { image1: 'ImageName1.jpg', image2: 'ImageName2.jpg' }, // Optional: Image data, if relevant
    isDocumentRetrieval: true,                           // Optional: Toggles document retrieval; use it to bypass retrieval
    onStreamEvent: (data) => console.log('Stream event:', data),  // Required: Callback for handling stream data
    onStreamEnd: (data) => console.log('Stream ended:', data),    // Optional: Callback for when the stream ends
    onError: (error) => console.error('Stream error:', error),    // Optional: Callback for handling errors
};

startChatStream(params);
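If you want the complete response as a single string (for logging or persistence), the callbacks above can be wrapped in a Promise. A minimal sketch, not part of the package's API; it assumes each `onStreamEvent` payload is a text chunk, which may differ in your deployment, so adapt the chunk handling accordingly:

```javascript
// Wraps a callback-style chat starter (such as startChatStream) into a
// Promise that resolves with the accumulated text once the stream ends.
function chatOnce(startFn, params) {
  return new Promise((resolve, reject) => {
    let fullText = '';
    startFn({
      ...params,
      onStreamEvent: (data) => { fullText += String(data); }, // assumes data is a text chunk
      onStreamEnd: () => resolve(fullText),
      onError: (error) => reject(error),
    });
  });
}
```

Usage: `const answer = await chatOnce(startChatStream, params);`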

🔑 Parameters

Required Parameters:

  • WEBSOCKET_URL: WebSocket URL of your eLLM deployment. Example: wss://dev-egpt.techo.camp/graphql.
  • organizationName: Name of the organization where the assistant is created.
  • chatAssistantName: Name of the assistant you're interacting with.
  • selectedAIModel: The AI model used (e.g., GPT-4).
  • userName: The name of the user interacting with the assistant.
  • userEmailId: Email ID of the user.
  • userId: Unique user ID.
  • query: The question or prompt you want to send to the AI.
  • requestId: Unique request ID, e.g., requestId-${uuid()}.
  • onStreamEvent: Callback function to capture incoming stream events.

Optional Parameters:

  • replyMessage: If you want to include a previous response with the new query, pass it here. Leave empty for normal chat scenarios.
  • customPrompt: Use this to add additional context to the prompt sent to the AI.
  • images: Pass image data if relevant.
  • isDocumentRetrieval: Toggles document retrieval; use it to bypass retrieval for a request.
  • enableForBackend: Set to true when using this package on the backend (e.g., Node.js). Defaults to false, which suits frontend use (e.g., React/Next.js).
  • onStreamEnd: Callback for when the stream ends. Useful for handling final events or cleanup.
  • onError: Callback for capturing any errors during the stream.
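For reference, a call needs only the ten required fields above; the optional ones can be omitted entirely. A sketch with placeholder values (swap in your own deployment URL, organization, and user details):

```javascript
// Minimal required parameter set for startChatStream; every value below is a
// placeholder, not a real deployment or user.
const minimalParams = {
  WEBSOCKET_URL: 'wss://your-ellm-deployment/graphql',
  organizationName: 'TechOrg',
  chatAssistantName: 'MyAssistant',
  selectedAIModel: 'gpt-4',
  userName: 'John Doe',
  userEmailId: 'john.doe@example.com',
  userId: 'user-123',
  query: 'What can you help me with?',
  requestId: `requestId-${Date.now()}`, // any unique string; the example above uses uuid()
  onStreamEvent: (data) => console.log('chunk:', data),
};
```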

🛠️ Image Upload

You can upload an image to the server using the uploadImage function:

import { uploadImage } from '@e-llm-studio/streaming-response';

const baseURL = 'https://your-ellm-deployment'; // Base URL of your eLLM deployment
const imageFile = document.querySelector('input[type="file"]').files[0];

uploadImage(baseURL, imageFile)
  .then((message) => console.log(message))
  .catch((error) => console.error(error));

🛠️ Image Deletion

To delete an image from the server, use the deleteImage function:

import { deleteImage } from '@e-llm-studio/streaming-response';

deleteImage('User Name', 'image.jpg')
  .then((message) => console.log(message))
  .catch((error) => console.error(error));

👥 Community & Support

For any questions or issues, feel free to reach out via our GitHub repository or join our community chat! We’re here to help. 😊