
ai-maestro-edge

v1.0.0

Edge server to control AI Docker containers


AI Maestro Edge

Node service controlled by ai-maestro-api.

npm install ai-maestro-edge
npx ai-maestro-edge

Description

This application is a simple server built using Express that manages the lifecycle and model loading of Ollama or Stable Diffusion containers running in Docker. It provides endpoints for creating, destroying, and managing instances on specific GPUs and ports, and it also supports loading models into those containers.

Endpoints

  • POST /up-container: Creates a new instance by spinning up a Docker container with the specified name, GPU IDs, and port. When creating a stable diffusion container, you must also pass diffusionModel with either 'sdxl-turbo' or 'sd-turbo', depending on which model you want the container to run. Returns a 200 status code upon success.
  • POST /down-container: Stops and removes an existing instance (Docker container) based on its name. If the container you are downing runs a diffusion model, also pass mode with the value "diffusion". Returns a 200 status code upon success.
  • POST /down-all-containers: Stops and removes all instances (Docker containers), both diffusion models and LLMs. Returns a 200 status code upon success.
  • POST /load-model: Loads a specified model into a given container, by running the 'ollama run MODEL' command for LLMs, or by issuing a request to a diffusion container to load the model into VRAM. If the model you are loading is a diffusion model, also pass mode with the value "diffusion". Returns a 200 status code upon success.
  • GET /health: A simple health check endpoint, which returns a 200 status code if the server is running.
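For illustration, the endpoints above might be exercised with curl along these lines. The port (3000) and the exact request-body field names (name, gpuIds, port, diffusionModel, containerName, model, mode) are assumptions inferred from the endpoint descriptions, not confirmed by the source; adjust them to match the actual server.

```shell
# Assumed base URL and field names -- adjust to match the running server.
BASE="http://localhost:3000"

# Spin up a stable diffusion container on GPU 0:
curl -X POST "$BASE/up-container" \
  -H "Content-Type: application/json" \
  -d '{"name": "sd-1", "gpuIds": [0], "port": 8188, "diffusionModel": "sd-turbo"}'

# Load a model into an LLM container:
curl -X POST "$BASE/load-model" \
  -H "Content-Type: application/json" \
  -d '{"containerName": "ollama-1", "model": "llama3"}'

# Stop and remove the diffusion container:
curl -X POST "$BASE/down-container" \
  -H "Content-Type: application/json" \
  -d '{"name": "sd-1", "mode": "diffusion"}'
```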

Requirements

  • Node.js (>=18)
  • Docker
  • Ollama/ollama image in Docker

Installation and Usage

  1. Clone this repository: git clone <repository_url>
  2. Navigate to the project directory: cd <project_directory>
  3. Install dependencies: npm install
  4. Start the server: npm start or node main.ts
  5. Use an API client such as Postman, curl, or Insomnia to send requests to the endpoints above.
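Once the server is running (step 4), a quick smoke test is to hit the health endpoint; port 3000 is an assumption, so substitute whatever port the server actually listens on.

```shell
# Assumed port -- expect an HTTP 200 response while the server is running.
curl -i http://localhost:3000/health
```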