

Vonage Video API - Video Effects

The Vonage Video Effects library adds effects to a camera video stream that you can use with the Vonage Video API.

Its main feature is background blur, which blurs everything behind the person in the frame while leaving the person untouched.

Live Example

You can see a working example at https://vonage-video-effect-sample-app.herokuapp.com/.

Prerequisites

Installation

NPM

You can install directly from npm.

npm install @vonage/video-effects

Using this method, you can import vonage-video-effects like so:

import * as VideoEffects from '@vonage/video-effects';
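
With the namespace import, the classes documented below are available as properties of VideoEffects, for example:

const { BackgroundEffectProcessor, BackgroundBlurEffect, VirtualBackgroundEffect } = VideoEffects;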

Script tag

You can also copy vonage-video-effects.js from the dist/build folder and include it directly in your web app using a <script> tag.


<script src="https://my-server-path/vonage-video-effects.js"></script>

Using this method, vonage-video-effects.js will set a browser global:

const VideoEffects = OT.VideoEffect;

Build the library locally

You can build the library locally following these steps:

  1. npm install
  2. npm run build-umd
  3. cd sample-app and run npm install, then start the sample app; the app server will start on port 3000

Assets

The VideoEffect library uses WebAssembly to run TensorFlow Lite for person segmentation. You need to serve the tflite model and binaries so they can be loaded properly. These files can be downloaded from the assets folder.

These effects run TensorFlow Lite using the MediaPipe Selfie Segmentation Landscape Model and require Chrome's WebAssembly SIMD support to achieve the best performance. WebAssembly SIMD can be turned on by visiting chrome://flags on Chrome versions 84 through 90; it is enabled by default on Chrome 91+.
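
For example, here is a minimal sketch of serving that folder over HTTP with an Express static route (Express, the port, and the local folder layout are illustrative assumptions, not part of this library):

// server.js - serve the tflite model and wasm binaries under /assets
const express = require('express');
const path = require('path');

const app = express();
// if your web app runs on a different origin, you may also need to add CORS headers here
app.use('/assets', express.static(path.join(__dirname, 'assets')));
app.listen(8080, () => console.log('Assets available at http://localhost:8080/assets'));

The resulting URL is what you pass as assetsPath when constructing the BackgroundEffectProcessor (see below).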

Background Effect Processor

The BackgroundEffectProcessor allows you to dynamically add different background effects, enable and disable them, and makes sure everything is applied to the stream being published through the OpenTok video API.

| Option | Type | Required | Description |
| ----------- | ----------- | --- | ----------- |
| assetsPath | String | Yes | The path where the assets are loaded from. These files can be downloaded from the dist/build folder. |
| frameRate | Number | No | The frames per second (fps) to use for the output stream (default 30). |

Example:

const effectProcessor = new BackgroundEffectProcessor({
    assetsPath: 'https://your-server-url.com/assets',
    maskBlurRadius: 5,
    frameRate: 30
});
// load an effect (a BackgroundBlurEffect or VirtualBackgroundEffect instance, see below)
await effectProcessor.loadEffect(someEffect);

// pause effect processing
effectProcessor.pauseStreamProcessing(true);
// set and dynamically change the input stream the effects are applied to
effectProcessor.setInputStream(yourCameraStream);
// enable / disable the effect; if disabled, the input stream is simply forwarded to the output
effectProcessor.enableEffect(true);
// the output stream with the effect applied; it changes dynamically based on the calls above
effectProcessor.outputStream;

Currently, the BackgroundEffectProcessor supports the following two effects.

Background Blur Effect

The BackgroundBlurEffect applies a blur filter on the background in each video frame and leaves the person untouched. The constructor has the following options:

| Option | Type | Required | Description |
| ----------- | ----------- | --- | ----------- |
| blurFilterRadius | Number | No | The background blur filter radius to use, in pixels (default 15). |
| maskBlurRadius | Number | No | The blur radius to use when smoothing out the edges of the person's mask (default 5). |

Example:

const effectProcessor = new BackgroundEffectProcessor({...});
await effectProcessor.loadEffect(new BackgroundBlurEffect({
    blurFilterRadius: 15
}));

Virtual Background Effect

The Virtual Background Effect applies a virtual background in each video frame and leaves the person untouched. The constructor has the following options:

| Option | Type | Required | Description |
| ----------- | ----------- | --- | ----------- |
| virtualBackground.backgroundType | String | Yes | Type of virtual background. The only supported type is "image". |
| virtualBackground.backgroundImage | HTMLImageElement | Yes | The HTMLImageElement representing the current background image. |
| fitType | Number | No | The ImageFit for positioning of the background image in the viewport. Available options: "Contain", "Cover", "Fill" or "None". |
| maskBlurRadius | Number | No | The blur radius to use when smoothing out the edges of the person's mask (default 5). |

Example:


async function loadImage(name) {
    return new Promise((resolve) => {
        const image = document.createElement('img');
        image.crossOrigin = 'anonymous';
        image.src = `backgrounds/${name}.jpg`;
        image.onload = () => resolve(image);
    });
}

const effectProcessor = new BackgroundEffectProcessor({...});
const backgroundImage = await loadImage('vonage_background');
await effectProcessor.loadEffect(new VirtualBackgroundEffect({
    virtualBackground: {
        backgroundType: 'image',
        backgroundImage
    }
}));

The library applies effects to a MediaStream object. You need to create the MediaStream yourself and pass it to the library with setInputStream, as sketched below and in the full integration example that follows.
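
Here is a minimal sketch, assuming the standard navigator.mediaDevices.getUserMedia browser API as the camera source (the constraints and the assets URL are illustrative):

const effectProcessor = new BackgroundEffectProcessor({
    assetsPath: 'https://your-server-url.com/assets'
});

// create the camera MediaStream yourself, then hand it to the processor
const cameraStream = await navigator.mediaDevices.getUserMedia({ video: true, audio: false });
effectProcessor.setInputStream(cameraStream);
await effectProcessor.loadEffect(new BackgroundBlurEffect());
// effectProcessor.outputStream now carries the camera feed with the blur applied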

Bringing it all together

A simple example of a full integration with the OpenTok video API:

import OT from "@opentok/client"
import {
    BackgroundEffectProcessor,
    BackgroundBlurEffect,
    VirtualBackgroundEffect,
} from "@vonage/video-effects";


async function getLocalMedia() {
    try {
        return await OT.getUserMedia({audioSource: null, videoSource: true});
    } catch (err) {
        console.error('OTGetUserMedia - err', err);
    }
}

async function createBackgroundEffectProcessor() {
    const effectProcessor = new BackgroundEffectProcessor({assetsPath: 'https://your-server-url.com/assets'});
    effectProcessor.setInputStream(await getLocalMedia());
    return effectProcessor;
}

async function publishToSession(effectProcessor) {
    // Create the publisher and publish into the session
    let publisher = OT.initPublisher(
        'publisher',
        {
            videoSource: effectProcessor.outputStream.getVideoTracks()[0],
            width: 640,
            height: 480,
            insertMode: 'append'
        },
        (err) => {
            if (err) {
                console.log('Error while creating the publisher', err);
                return;
            }
            console.log('Publisher Created');
        }
    );
    // apikey, sessionId and token come from your Vonage Video API project
    const session = OT.initSession(apikey, sessionId);
    session.on('streamCreated', (event) => {
        session.subscribe(event.stream, (err) => {
            if (err) {
                console.log('Error while subscribing', err);
                return;
            }
            console.log('Subscribed to ', event.stream);
        });
    });
    session.on('streamPropertyChanged', (event) => {
        if (event.stream === publisher.stream && event.changedProperty === 'hasVideo') {
            // make sure we pause effect processing while we are not publishing video
            effectProcessor.pauseStreamProcessing(!event.newValue);
        }
    });
    session.connect(token, (err) => {
        if (err) {
            console.log('Error while connecting to the session', err);
            return;
        }
        console.log('Session Connected');
        session.publish(publisher, (errPublisher) => {
            if (errPublisher) {
                console.log('Error while publishing into the session', errPublisher);
                return;
            }
            console.log('Successfully published the stream');
        });
    });

    return publisher;
}


// putting it all together
const effectProcessor = await createBackgroundEffectProcessor();
await effectProcessor.loadEffect(new BackgroundBlurEffect());
const publisher = await publishToSession(effectProcessor);

// enable / disable the effect
function toggleBackgroundEffect() {
    effectProcessor.enableEffect(!effectProcessor.effectEnabled);
}

// dynamically switch to blur effect
async function switchToBlurEffect(blurRadius) {
    await effectProcessor.loadEffect(new BackgroundBlurEffect({blurFilterRadius: blurRadius}));
}

// dynamically switch to background image effect
async function switchToImageEffect(imageUrl) {
    const image = await new Promise((resolve) => {
        const img = document.createElement('img');
        img.crossOrigin = 'anonymous';
        img.onload = () => resolve(img);
        img.src = imageUrl;
    });
    await effectProcessor.loadEffect(new VirtualBackgroundEffect({
        virtualBackground: {
            backgroundType: 'image',
            backgroundImage: image
        }
    }));
}

// change the input stream e.g. when switching to a different camera
function switchCamera(newCameraStream) {
    effectProcessor.setInputStream(newCameraStream);
}

// clean up everything
function destroy() {
    effectProcessor.destroy();
    publisher.destroy();
}