HyperFlow Job executor


This is a basic HyperFlow job executor that uses a local directory path to read and write files, and Redis (or RabbitMQ/Redis) for communication with the HyperFlow engine.

Adding the executor to a Docker image

  • Install Node.js 12.x or higher
  • Install the executor package (see the Dockerfile sketch after this list):
    • Latest version: npm install -g @hyperflow/job-executor
    • Specific version: npm install -g @hyperflow/job-executor@<version>
    • From master branch: npm install -g https://github.com/hyperflow-wms/hyperflow-job-executor/archive/master.tar.gz
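
For illustration, a minimal Dockerfile sketch that follows these steps might look as below; the node:12 base image and the exact layout are assumptions, not part of the original instructions.

# Hypothetical Dockerfile snippet
FROM node:12
# Install the executor globally inside the image
RUN npm install -g @hyperflow/job-executor
# Directory the executor falls back to as its working directory (see HF_VAR_WORK_DIR below)
RUN mkdir -p /work_dir
WORKDIR /work_dir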

Running jobs (Redis only)

Jobs can be run with either of the following commands:

  • hflow-job-execute <taskId> <redisUrl>, where taskId is a unique job identifier and redisUrl is a URL of the Redis server from which the actual job command is fetched. Both parameters are available in HyperFlow functions as context.taskId and context.redis_url, respectively.
  • hflow-job-execute <redisUrl> -a -- <taskId>... -- to run multiple jobs sequentially (useful for agglomeration of small jobs).

Jobs can be submitted e.g. using the HyperFlow function k8sCommand. See the RemoteJobs example to learn more details; illustrative invocations follow below.
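
For illustration, here are two hypothetical invocations; the task identifier and Redis URL are made-up placeholders, not values from the original documentation.

# Run a single job (taskId and redisUrl normally come from context.taskId and context.redis_url)
hflow-job-execute wf1:42 redis://redis-host:6379
# Run several small jobs sequentially within one executor process
hflow-job-execute redis://redis-host:6379 -a -- wf1:42 wf1:43 wf1:44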

Running jobs with AMQP listener

The AMQP listener can be run with the following command:

  • hflow-job-listener.js

Note that in order to run the executor, the following variables must be set to proper values:

  • RABBIT_HOSTNAME
  • QUEUE_NAME

More details about these configuration variables can be found in the Configuration section; a minimal startup sketch follows below.
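
A minimal startup sketch, assuming the listener is run directly and using placeholder values for the RabbitMQ hostname and queue name:

# Point the listener at the RabbitMQ instance and queue (example values only)
export RABBIT_HOSTNAME=rabbitmq.example.local
export QUEUE_NAME=hyperflow-jobs
hflow-job-listener.js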

Logging

The executor creates log files in directory <work_dir>/logs-hf that contain:

  • command used to execute the job
  • stdout and stderr of the job
  • metrics (CPU/memory/IO/network usage)
  • events (job start/end)
  • system information (e.g. hardware configuration)
  • all environment variables starting with HF_LOG_ (logged as a JSON object following the conventions of the read-env package; see the example after this list)
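
As an illustration, any HF_LOG_ variable present in the job's environment is picked up and logged; the variable name and value below are made up.

# This custom variable will be included in the logged JSON object
export HF_LOG_EXPERIMENT_NAME=test-run-1
hflow-job-execute wf1:42 redis://redis-host:6379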

Configuration

The following environment variables can be used to adjust the behavior of the job executor (a combined example follows the list):

  • HF_VAR_PROBE_INTERVAL (default 2000): time interval (in ms) at which to probe and log metrics.
  • HF_VAR_NUMBER_OF_RETRIES (default 1): how many times the job should be re-executed if it returns a non-zero exit code.
  • HF_VAR_BACKOFF_SEED (default 10): factor used in calculating the backoff delay between retries.
  • HF_VAR_WAIT_FOR_INPUT_FILES: if set to 1, the executor will check if input files exist and wait for them (useful in systems where files are synchronized in an eventually consistent fashion).
  • HF_VAR_FILE_WATCH_NUM_RETRIES (default 10): how many times the executor should check for the existence of the input files (with backoff waits in between).
  • HF_VAR_WORK_DIR: path to the working directory where the job should be executed. If not set, /work_dir will be used if it exists; otherwise the executor will not change the working directory.
  • HF_VAR_LOG_DIR: path to the directory where log files should be written. If not set, <work_dir>/logs-hf will be used.
  • HF_VAR_LOG_LEVEL (default info): set logging level (trace, debug, info, warn, error, fatal).
  • HF_VAR_ENABLE_NETHOGS: if set (to any value), logs from nethogs will be written (experimental).
  • HF_VAR_DRY_RUN: (for testing/debugging) if set to 1, the executor will immediately return with job exit status 0 (success).
  • HF_LOG_*: all variables starting with HF_LOG_ will be logged in the job log files.
  • RABBIT_HOSTNAME: RabbitMQ instance hostname. Can be supplied with basic auth credentials, for example username:password@<rabbitmq-hostname>. When no username or password is specified, the default RabbitMQ guest:guest credentials are used. Required in AMQP executor mode.
  • QUEUE_NAME: the name of the queue on which the executor will wait for task messages. Required in AMQP executor mode.
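
Putting a few of these together, a hypothetical Redis-mode invocation might look as follows; all paths and values are examples only.

# Retry failed jobs up to 3 times and wait for eventually consistent input files
export HF_VAR_NUMBER_OF_RETRIES=3
export HF_VAR_WAIT_FOR_INPUT_FILES=1
# Use a custom working directory and log directory, with more verbose logging
export HF_VAR_WORK_DIR=/data/work_dir
export HF_VAR_LOG_DIR=/data/logs-hf
export HF_VAR_LOG_LEVEL=debug
hflow-job-execute wf1:42 redis://redis-host:6379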

Releasing

For quick and dirty developer releases:

# Commit your changes
make dev-release

To release a proper version:

# Commit your changes
# Use npm version <arg> to tag your changes and bump the npm version
make release