@laurent22/biniou

v1.0.22

Task automation utility

Biniou

Biniou is a local, event-driven job scheduler and automation framework. You can run commands that are triggered under various conditions and pipe them together. Jobs can be started on a schedule, by events generated by other jobs, or by local file system changes.

Jobs can be defined from the included templates or, for maximum flexibility, written in JavaScript. In that case you can do any processing you want and dispatch events in the format of your choice.

This utility is mostly designed as an offline tool running on your own computer: more flexible than plain "cron" tasks, and lighter than IFTTT or Huginn when you just want simple scripts running locally under various conditions.

Usage

  • Install biniou globally, for example using npm install -g @laurent22/biniou
  • You then need to let it run in the background, using biniou start

Creating a job

  • To create a job, create a folder in ~/.config/biniou/jobs - the folder name will be the job ID.
  • Then add a job.json file in that folder - this file describes the job.
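
For example, a JavaScript job with the hypothetical ID my-job would consist of the following files (index.js only applies to js jobs, as shown in the sections below):

~/.config/biniou/jobs/my-job/job.json
~/.config/biniou/jobs/my-job/index.js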

Running a job on a schedule

A simple job.json file can be something like this:

{
	"type": "shell",
	"script": "/path/to/your/command.sh",
	"trigger": "cron",
	"triggerSpec": "0 * * * *"
}

This is essentially a simple cron job. In some contexts it is easier than using the system crontab (especially on macOS), since the script runs in the same environment as your user, which means paths, environment variables, etc. will be defined.
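
The triggerSpec above is a standard cron expression: 0 * * * * runs the script at the top of every hour. As a sketch with a placeholder script path, a job that runs a backup script every day at 02:00 could look like this:

{
	"type": "shell",
	"script": "/path/to/your/backup.sh",
	"trigger": "cron",
	"triggerSpec": "0 2 * * *"
}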

Running a job when file or folder is changed

{
	"type": "js",
	"trigger": "fileSystem",
	"triggerSpec": {
		"paths": ["/path/to/watch"],
		"depth": 0,
		"fileExtensions": ["txt"]
	}
}

The above job will watch /path/to/watch for any change. It will not do so recursively, since depth is set to 0, and only text files (.txt) will be watched.

Then the index.js file can be used to do something when a file is changed:

exports = {
	run: async (context) => {
		const params = context.params;
		const event = params.event;
		const path = params.path;

		if (event === 'add') {
			console.info('A file has been added: ' + path);
			// In this example, each file that is added is copied to a backup folder
			await biniou.execCommand(['cp', path, '/backup/']);
		}
	},
};

See the Chokidar documentation for the list of supported events. The most useful ones are probably add, change and unlink.
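
As a sketch building on the example above (the /backup/ path is a placeholder), a run function that reacts to all three of those events could look like this:

exports = {
	run: async (context) => {
		const { event, path } = context.params;

		if (event === 'add' || event === 'change') {
			// Copy new and modified files to a backup folder
			await biniou.execCommand(['cp', path, '/backup/']);
		} else if (event === 'unlink') {
			// The file was removed - just log it
			console.info('A file has been removed: ' + path);
		}
	},
};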

Running a daemon

You may want to have biniou start a process once and let it run in the background - for example if you are launching a script that already handles watching files itself, or that needs to run all the time watching some resource.

In that case you can simply start it as a daemon. It is then up to you to handle how this script works - stdout and stderr will be printed to biniou's output, but otherwise there will be no error handling or reporting.

{
	"type": "shell",
	"script": "/path/to/your/daemon.sh",
	"trigger": "daemon"
}

Dispatching events

A job can be set up to dispatch events. These events can then be used by other jobs for further processing, thus creating a pipe.

To dispatch events, you need to write your script in JavaScript. For example, the following job will parse an RSS feed and create an event for each RSS item:

job.json:

{
	"type": "js",
	"trigger": "cron",
	"triggerSpec": "0,30 * * * *"
}

index.js:

exports = {
	run: async () => {
		const feed = await biniou.rssParser({
			customFields: {
				item: ['twitter-text'],
			}
		}).parseURL('https://joplinapp.org/rss');

		const events = feed.items.map(item => {
			return {
				id: item.guid,
				title: item.title,
				content: item.content,
				link: item.link,
				twitterText: item['twitter-text'],
				date: item.isoDate,
			};
		});

		await biniou.dispatchEvents('joplin_post', events, { allowDuplicates: false });
	},
};

Note in particular the biniou.* functions - several of these are available for various tasks to make it easier to define jobs.

Processing events

Once a job has dispatched events, you will probably want to process these events in some way. For this you can create an event processing job.

Likewise you would create a job.json file, but this time the trigger will be event. You also set the triggerSpec to the event types that your job can process. There can be any number of event types in that array.

job.json:

{
	"type": "js",
	"trigger": "event",
	"triggerSpec": ["joplin_post"]
}

You then create an index.js file that processes the event. This file should contain a run function, which receives and processes the event. In the example below, an email is sent for each event:

index.js:

exports = {
	run: async function(context) {
		const mailerConfig = {
			host: "smtp.example.com",
			port: 465,
			secure: true,
			auth: {
				user: 'user@example.com',
				pass: 'Moscow4',
			},
			tls: {
				rejectUnauthorized: false
			}
		};

		const content = JSON.parse(context.event.body); 

		await biniou.mailer(mailerConfig).sendMail({
			from: '"Biniou" <biniou@example.com>',
			to: 'me@example.com',
			subject: '[Biniou] ' + content.title,
			text: context.event.body,
		});
	},
};
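
For comparison, a stripped-down sketch of a processing job that simply logs each incoming event, using the same context.event.body field shown above:

exports = {
	run: async (context) => {
		// The event body is the JSON-serialized event dispatched by the other job
		const content = JSON.parse(context.event.body);
		console.info('[Biniou] Received event: ' + content.title);
	},
};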

Job structure

The job.json file defines the basic job metadata:

| Property | Type | Possible Values | Description |
|------------------|-------------------|--------------------|-------------|
| id | string | | Unique identifier for the job. This is the name of the folder that contains job.json. |
| type | JobType | 'js', 'shell' | Type of the job. |
| trigger | JobTrigger | 'cron', 'event', 'fileSystem', 'daemon' | Defines when the job should be triggered. |
| triggerSpec | JobTriggerSpec | string, string[] or FileSystemTriggerSpec | Specification of the trigger. It can be a single string or an array of strings (e.g., cron expressions or event identifiers). If the job trigger is fileSystem, it should be a FileSystemTriggerSpec (see below). |
| script | string | | The script to be executed. Defaults to index.js for js jobs. Otherwise you need to provide the path to the shell command. |
| enabled | boolean | true or false | Indicates whether the job is enabled and should run. Defaults to true. |
| template | string | | Optional template name, if the job is based on a template. |
| depth | number | | If the job trigger is fileSystem, this specifies how deep the folders should be watched: undefined means all folders recursively, 0 means only the current folder, 1 is one level deep, etc. |
| params | any | | Additional parameters that can be used for the job execution. |
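
Putting a few of these properties together, a hypothetical job.json could look like the one below (the schedule is arbitrary, and the params object is just made-up data as allowed by the table above):

{
	"type": "js",
	"trigger": "cron",
	"triggerSpec": "0,30 * * * *",
	"script": "index.js",
	"enabled": true,
	"params": {
		"customValue": 123
	}
}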

FileSystemTriggerSpec

| Property | Type | Possible Values | Description |
|------------------|-------------------|--------------------|-------------|
| paths | string[] | | List of paths that must be watched - can be either files or directories. |
| depth | number | | Optional. Specifies how deep the folders should be watched: undefined means all folders recursively, 0 means only the current folder, 1 is one level deep, etc. |
| fileExtensions | string[] | | Optional. File extensions that must be watched. Leave it empty to watch all files. |
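
For example, a spec that recursively watches a directory as well as a single file, limited to Markdown and text files (the paths are placeholders), could look like this:

{
	"type": "js",
	"trigger": "fileSystem",
	"triggerSpec": {
		"paths": ["/path/to/notes", "/path/to/todo.txt"],
		"fileExtensions": ["md", "txt"]
	}
}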

The biniou API

Within a JavaScript job, you can access the biniou global object, which provides various utility functions. The full list of functions is described in JobSandbox.ts.

Development

Run yarn watch to automatically build the application.

To run just one job, for testing, use:

yarn start run NAME_OF_JOB

NAME_OF_JOB is the name of the job's folder in ~/.config/biniou/jobs.

License

MIT