
@stone-js/pipeline

v0.0.46 • 531 downloads

An implementation based on the Chain of Responsibility (aka CoR) design pattern.

Stone.js: Pipeline


An implementation based on the Chain of Responsibility (aka CoR) design pattern. In summary, each pipe takes a job, processes it, and forwards it to the next pipe.


Synopsis

The Pipeline class is a versatile utility for running a set of input values through a series of configurable "pipes". Pipes can be either functions or class methods that process values sequentially. The class supports flexible configuration, including synchronous and asynchronous execution, custom method invocation on each pipe, and dependency injection through a resolver or container. The resolver can be customized via PipelineOptions to control how pipes are resolved.

Installation

The Pipeline library is available from the npm registry and can be installed as follows:

npm i @stone-js/pipeline

Yarn:

yarn add @stone-js/pipeline

PNPM:

pnpm add @stone-js/pipeline

[!NOTE] This package is Pure ESM. If you are unfamiliar with what that means or how to handle it in your project, please refer to this guide on Pure ESM packages.

Make sure your project setup is compatible with ESM. This might involve updating your package.json or using certain bundler configurations.
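For most Node.js projects, that means declaring the package as an ES module in package.json; the snippet below is a minimal sketch, and your own build setup may require additional bundler or tsconfig adjustments:

{
  "type": "module"
}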

The Pipeline module can only be imported via ESM import syntax:

import { Pipeline } from '@stone-js/pipeline';

Getting Started

The Pipeline class allows you to send objects through a series of operations. It’s highly configurable and designed to work with dependency injection.

Compatibility with JavaScript and TypeScript

The Pipeline library is compatible with both TypeScript and vanilla JavaScript projects. While the examples provided are written in TypeScript for improved type safety and developer experience, you can also use Pipeline seamlessly in JavaScript environments without any modifications.

To use it in a JavaScript project, simply import the library as usual, and TypeScript types will not interfere. All TypeScript-specific features, such as type annotations, are optional and won't affect usage in JavaScript.

Here’s a simple example to get you started:

  1. Import the Pipeline class:

    import { Pipeline } from '@stone-js/pipeline';
  2. Create a new Pipeline instance:

    // Creating a basic pipeline
    const pipeline = new Pipeline<number>();
       
    // Setting up pipes (functions that will transform the passable value)
    const addOne = (value: number, next: (value: number) => number) => next(value + 1);
    const multiplyByTwo = (value: number, next: (value: number) => number) => next(value * 2);
       
    // Configure the pipeline
    pipeline.send(1).through([addOne, multiplyByTwo]).sync(true);
       
    // Run the pipeline and get the result
    const result = pipeline.thenReturn(); 
       
    console.log(result); // Output: 4

In the above example:

  • send(1) initializes the pipeline with a value of 1.
  • through([addOne, multiplyByTwo]) sets up the transformation functions (pipes).
  • sync(true) sets synchronous execution.
  • thenReturn() runs the pipeline, with the output being (1 + 1) * 2 = 4.
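
The same pipeline works unchanged in plain JavaScript; the sketch below mirrors the example above with the type annotations removed:

import { Pipeline } from '@stone-js/pipeline';

// Same pipes as above, written without type annotations
const addOne = (value, next) => next(value + 1);
const multiplyByTwo = (value, next) => next(value * 2);

const pipeline = new Pipeline();
pipeline.send(1).through([addOne, multiplyByTwo]).sync(true);

console.log(pipeline.thenReturn()); // Output: 4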

Configuring with PipelineOptions and Custom Resolver

The Pipeline class can be configured with an options object, PipelineOptions, which lets you pass a custom resolver for the pipes. This gives you greater flexibility over how pipes are resolved and instantiated during pipeline execution.

Here's an example of how you can use the resolver to manage dependency resolution:

import { Pipeline, MixedPipe, PipeInstance, Passable } from '@stone-js/pipeline';

// Custom resolver function to resolve pipes
const customResolver = <T extends Passable, R extends Passable | T = T>(pipe: MixedPipe): PipeInstance<T, R> => {
  if (typeof pipe === 'function') {
    return new pipe() as PipeInstance<T, R>;
  }
  throw new Error(`Cannot resolve pipe: ${String(pipe)}`);
};

// Create a new pipeline instance with the custom resolver
const pipeline = Pipeline.create({
  resolver: customResolver,
});

// Configure and execute the pipeline
pipeline.send('customResolver')
  .through([(value: string, next: (val: string) => string) => next(value.toUpperCase())])
  .sync(true);

const result = pipeline.thenReturn();
console.log(result); // Output: "CUSTOMRESOLVER"

Usage

The Pipeline class provides an easy way to chain operations and execute them on an initial set of values. Below, you will find some typical usage patterns to help you get started.

Basic Configuration and Execution

Here is a simple usage example that demonstrates how to use the Pipeline class to send data through a series of transformations:

import { Pipeline } from '@stone-js/pipeline';

// Step 1: Create the pipeline instance
const pipeline = new Pipeline<string>();

// Step 2: Create a few pipes (transformation functions)
const toUpperCase = (value: string, next: (value: string) => string) => next(value.toUpperCase());
const addGreeting = (value: string, next: (value: string) => string) => next(`Hello, ${value}!`);

// Step 3: Set the initial passable value and add pipes to the pipeline
pipeline.send("world").through([toUpperCase, addGreeting]).sync(true);

// Step 4: Execute the pipeline and obtain the result
const result = pipeline.then((value) => value);

console.log(result); // Output: "Hello, WORLD!"

Asynchronous Pipeline

The Pipeline class also supports asynchronous pipes, allowing you to run asynchronous operations, such as fetching data from an API or performing an I/O operation.

import { Pipeline } from '@stone-js/pipeline';

// Step 1: Create the pipeline instance
const pipeline = new Pipeline<number>();

// Step 2: Create asynchronous pipes
const fetchData = async (value: number, next: (value: number) => Promise<number>) => {
  const fetchedValue = await mockApiFetch(value);
  return next(fetchedValue);
};

const mockApiFetch = async (value: number): Promise<number> => {
  return new Promise((resolve) => {
    setTimeout(() => resolve(value * 10), 1000);
  });
};

// Step 3: Configure the pipeline
pipeline.send(5).through([fetchData]);

// Step 4: Execute the pipeline asynchronously and get the result
const result = await pipeline.thenReturn();

// Output after 1 second: 50
console.log(result);

Dependency Injection with Custom Resolver

The resolver approach lets you manage pipe resolution more flexibly than relying on a fixed container. The example below uses a custom resolver in place of a Container.

import { Pipeline, PipeResolver, Passable } from '@stone-js/pipeline';

// Create a custom resolver
const resolver: PipeResolver<Passable, Passable> = (pipe) => {
  if (typeof pipe === 'function') {
    return new pipe() as any; // Create an instance from the function pipe
  }
  throw new Error(`Pipe could not be resolved: ${pipe}`);
};

// Set up the pipeline with the resolver
const pipeline = Pipeline.create({
  resolver,
});

// Use the pipeline
pipeline.send('example')
  .through([
    {
      pipe: (value: string, next: (value: string) => string) => next(value.toLowerCase()),
    },
  ])
  .sync(true);

const result = pipeline.thenReturn();
console.log(result); // Output: "example"

Customizing Execution Method

The pipeline also allows customization of the method to call on each pipe using the via() method.

import { Pipeline } from '@stone-js/pipeline';

class CustomPipe {
  transform(value: string, next: (value: string) => string): string {
    return next(value.split('').reverse().join(''));
  }
}

const pipeline = new Pipeline<string>();

pipeline.send('pipeline')
  .through([new CustomPipe()])
  .via('transform') // Set method to 'transform'
  .sync(true);

const result = pipeline.thenReturn();
console.log(result); // Output: "enilepip"

Pipe Executor Order

The Pipeline class allows you to control the order in which pipes are executed using the priority attribute in the MetaPipe configuration. Each pipe in the pipeline can be assigned an optional priority level, which determines its execution order.

MetaPipe Configuration

The MetaPipe interface represents a configuration object for pipes, which includes the pipe to execute, optional parameters, and a priority level:

export interface MetaPipe {
  /** The pipe to execute, which can be a function or a string identifier. */
  pipe: Pipe;
  /** An optional array of parameters to pass to the pipe. */
  params?: unknown[];
  /** An optional priority level of the pipe. */
  priority?: number;
}
  • pipe: The pipe to execute, which can be either a function or a string identifier.
  • params: Optional parameters that are passed to the pipe during execution.
  • priority: An optional number that specifies the priority level of the pipe. Pipes are executed in order of their priority, with lower values indicating higher priority.

Setting Pipe Priorities

When adding pipes to the pipeline, you can assign different priority levels to control their execution order. By default, all pipes have the same priority level, but you can adjust these values to ensure certain operations are performed before others.

For example:

import { Pipeline, MetaPipe } from '@stone-js/pipeline';

// Create pipes with different priority levels
const pipe1: MetaPipe = {
  pipe: (value: number, next: (value: number) => number) => next(value + 1),
  priority: 1, // High priority, executed first
};

const pipe2: MetaPipe = {
  pipe: (value: number, next: (value: number) => number) => next(value * 2),
  priority: 2, // Lower priority, executed after pipe1
};

// Create a new pipeline and configure it with prioritized pipes
const pipeline = new Pipeline<number>();
pipeline.send(1).through([pipe2, pipe1]).sync(true);

// Execute the pipeline
const result = pipeline.thenReturn();
console.log(result); // Output: 4 (1 + 1, then multiplied by 2)

In this example, pipe1 is executed first because it has a higher priority (priority: 1), while pipe2 is executed afterward (priority: 2). By default, if no priority is provided, all pipes are treated equally and executed in the order they are added.

Summary

The Pipeline class offers flexible control over the lifecycle of your pipes. You can provide a custom resolver to determine how each pipe is instantiated or resolved before it is executed. This supports both dependency injection and straightforward function-based pipes, making the pipeline suitable for a wide variety of use cases.
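
For instance, a resolver can delegate to a dependency injection container. The container below is a hypothetical stand-in for whatever DI container your project uses, not part of this library:

import { Pipeline } from '@stone-js/pipeline';

// Hypothetical container: substitute your own dependency injection container here.
const container = {
  bindings: new Map<unknown, unknown>(),
  resolve (token: unknown): unknown {
    return this.bindings.get(token);
  },
};

// Resolve each pipe through the container and fail loudly for unknown pipes.
const pipeline = Pipeline.create({
  resolver: (pipe) => {
    const instance = container.resolve(pipe);
    if (instance === undefined) {
      throw new Error(`Cannot resolve pipe: ${String(pipe)}`);
    }
    return instance as any;
  },
});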

API documentation

Contributing

See Contributing Guide.

Credits