
@motorro/firebase-ai-chat-vertexai

v4.0.2

A library to run VertexAI chats on Firebase


Firebase VertexAI chat library

A VertexAI chat library. See the top-level documentation for a complete reference.

VertexAI setup

We also need to set up the VertexAI API. To do this, we prepare the model, the wrapper, and the instructions bound to the ID of your chat assistant config:

import {projectID} from "firebase-functions/params";
import {VertexAI} from "@google-cloud/vertexai";
import {firestore} from "firebase-admin";
import {getFunctions} from "firebase-admin/functions";
import {factory} from "@motorro/firebase-ai-chat-vertexai";

// Chat component factory
const chatFactory = factory(firestore(), getFunctions(), region);

// Create VertexAI adapter ("/threads" is the Firestore collection that stores chat threads)
const ai = chatFactory.ai(model, "/threads");

// Your assistant system instructions bound to the assistant ID in the chat config
const instructions: Readonly<Record<string, VertexAiSystemInstructions<any>>> = {
    "yourAssistantId": {
        instructions: instructions2,
        tools: {
            // Tool dispatcher: folds tool-call results into the accumulated chat data
            dispatcher: (dataSoFar, name, args) => dataSoFar,
            definition: [
                {functionDeclarations: [{name: "someFunction"}]}
            ]
        }
    }
};

// Options for the front-facing callable functions
const options: CallableOptions = {
  secrets: [], // add any secret parameters your functions use
  region: region,
  invoker: "public"
};
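The tools dispatcher in the config above is a stub that returns the data unchanged. As a minimal, self-contained sketch of what a dispatcher does, here is a plain-TypeScript illustration that folds tool-call arguments into the chat data; the type names here are hypothetical stand-ins, not the library's actual types:

```typescript
// Hypothetical chat data shape, for illustration only
interface CalcData {
    sum: number;
}

// A dispatcher gets the data accumulated so far, the called function
// name, and its arguments, and returns the updated data
type Dispatcher<T> = (dataSoFar: T, name: string, args: Record<string, unknown>) => T;

// Example: handle an "add" tool call by adding its argument to the sum
const calcDispatcher: Dispatcher<CalcData> = (dataSoFar, name, args) => {
    if (name === "add") {
        return {sum: dataSoFar.sum + Number(args["value"])};
    }
    // Unknown functions leave the data unchanged
    return dataSoFar;
};

const updated = calcDispatcher({sum: 2}, "add", {value: 3});
```

The real dispatcher receives the arguments the model supplied for the function declared in `definition`, so keeping it a pure data-in, data-out function makes it easy to test in isolation.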

Optional custom message mapper

By default, the library passes text messages to the AI as-is. If you want custom message processing, such as image handling or adding metadata, you can create your own AI message processor and supply it to the chat factory worker method. The default message mapper can be found here.

const myMessageMapper: VertexAiMessageMapper = {
    toAi(message: NewMessage): Array<Part> {
        throw new Error("TODO: Implement mapping from chat to VertexAI messages");
    },

    fromAi(message: GenerateContentCandidate): NewMessage | undefined {
        throw new Error("TODO: Implement VertexAI to chat message mapping");
    }
};
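To make the two directions concrete, here is a minimal text-only mapping sketch. The `Part` and `Candidate` shapes below are simplified stand-ins for the Vertex SDK types, and the string-based message type is an assumption for illustration:

```typescript
// Simplified stand-ins for the SDK types, for illustration only
interface Part { text?: string; }
interface Candidate { content: { parts: Part[] }; }

// Outgoing: wrap a chat message into VertexAI content parts
function toAiParts(message: string): Part[] {
    return [{text: message}];
}

// Incoming: join the candidate's text parts back into one chat
// message; return undefined to skip empty output
function fromAiCandidate(candidate: Candidate): string | undefined {
    const text = candidate.content.parts
        .map((p) => p.text ?? "")
        .join("");
    return text.length > 0 ? text : undefined;
}

const reply = fromAiCandidate({content: {parts: [{text: "Hi"}, {text: "!"}]}});
```

Returning `undefined` from the incoming direction is the natural place to drop candidates you do not want persisted to the chat.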

Custom resource cleaner

When you close the chat, the library cleans up the threads that were created during the chat. If you need any custom processing, you may add a custom cleaner that will be called as part of the cleanup:

/**
 * Chat resource cleaner
 */
const cleaner: ChatCleaner = {
    /**
     * Schedules cleanup commands stored inside chat data
     * @param chatDocumentPath Chat document
     */
    cleanup: async (chatDocumentPath: string): Promise<void> => {
        logger.d("Cleanup");
    }
}

Optional middleware

By default, the library saves all the messages that come from the AI. If you need custom processing, you can add AI message middleware. Take a look at the main documentation for details.

const handOver: MessageMiddleware<CalculateChatData, CalculatorMeta> = chatFactory.handOverMiddleware(
    "calculator",
    handOverProcessor
);
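The general idea behind message middleware is a chain where each step can transform the message batch before handing it to the next step. The following is a self-contained sketch of that pattern in plain TypeScript; the `Middleware` signature here is a simplified assumption, not the library's actual `MessageMiddleware` type:

```typescript
// Simplified middleware shape: transform the messages, then hand
// them to the next step in the chain
type Middleware = (
    messages: ReadonlyArray<string>,
    next: (m: ReadonlyArray<string>) => void
) => void;

// Example middleware: drop blank messages before they are saved
const dropEmpty: Middleware = (messages, next) => {
    next(messages.filter((m) => m.trim().length > 0));
};

// Run a message batch through a middleware chain; the innermost
// step records the final result (standing in for "save to chat")
function runChain(
    middleware: ReadonlyArray<Middleware>,
    messages: ReadonlyArray<string>
): ReadonlyArray<string> {
    let result: ReadonlyArray<string> = messages;
    let dispatch = (m: ReadonlyArray<string>): void => { result = m; };
    for (const mw of [...middleware].reverse()) {
        const next = dispatch;
        dispatch = (m) => mw(m, next);
    }
    dispatch(messages);
    return result;
}

const saved = runChain([dropEmpty], ["hello", "  ", "world"]);
```

A middleware that never calls `next` effectively swallows the batch, which is how a hand-over middleware like the one above can divert messages to another assistant instead of saving them.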

Command dispatcher configuration

The requests to front-facing functions return to the user as fast as possible after changing the chat state in Firestore. Since an AI run can take considerable time, we run it in a Cloud Task, "offline" from the client request. To execute the Assistant run we use the second class from the library, the VertexAiChatWorker. To create it, use the AiChat factory we created, as described in the main documentation.

To register the Cloud Task handler you may want to create the following function:

import {onTaskDispatched} from "firebase-functions/v2/tasks";
import {firestore} from "firebase-admin";
import {getFunctions} from "firebase-admin/functions";

// Function region
const region = "europe-west1";
// Collection path to store threads
const VERTEXAI_THREADS = "threads";

export const calculator = onTaskDispatched(
    {
      retryConfig: {
        maxAttempts: 1,
        minBackoffSeconds: 30
      },
      rateLimits: {
        maxConcurrentDispatches: 6
      },
      region: region
    },
    async (req) => {
      // Create and run a worker
      // See the `dispatchers` definitions below
      const vertexAi = new VertexAI({
          project: projectID.value(),
          location: region
      });
      const model = vertexAi.getGenerativeModel(
          {
              model: "gemini-1.0-pro",
              generationConfig: {
                  candidateCount: 1
              }
          },
          {
              timeout: 30 * 1000
          }
      );

      // Dispatch request  
      await chatFactory.worker(
          model, 
          VERTEXAI_THREADS, 
          instructions, 
          myMessageMapper, 
          cleaner, 
          [handOver]
      ).dispatch(
          req,
          (chatDocumentPath: string, meta: Meta) => {
             // Optional task completion handler
             // Meta - some meta-data passed to chat operation
          }   
      );
    }
);

The VertexAiChatWorker handles the VertexAiChatCommand and updates VertexAiChatState with the results.

Full example is available in the sample Firebase project.