Glou
Glou is a complete pipelining system compatible with Gulp.
With glou, you can:
- create reusable pipelines, as easily as you usually create JavaScript variables
- pipe them into each other
- configure or extend them to create inherited pipelines
- combine them with parallel or serial executions, while keeping the flow's content in the order you wish
- have powerful watchers, enabling partial rebuild to update your software milliseconds after save
A simple example, Gulpfile.js:
var argv = require('yargs').argv;
var glou = require('glou');
var gulpIf = require('gulp-if');
var gulpSourcemaps = require('gulp-sourcemaps');
var gulpUglify = require('gulp-uglify');
var gulpConcat = require('gulp-concat');
var env = {
sourcemaps: argv.dev && !argv.ie8, // sourcemaps in dev mode
uglify: !argv.dev || !argv.ie8
};
var buildjs = glou
.pipe('sourcemaps init', gulpIf, env.sourcemaps, gulpSourcemaps.init)
.pipe('uglify', gulpIf, env.uglify, gulpUglify)
.pipe('sourcemaps write', gulpIf, env.sourcemaps, gulpSourcemaps.write)
.remember()
;
var js = glou
.src(['libs/jquery/jquery.min.js', 'src/**/*.js'])
.pipe(buildjs)
.pipe('concat', gulpConcat, 'body.min.js')
.dest('build')
;
glou
.task('js', js)
.watch('src/**/*.js', 'js')
;
In this example, you can see two pipelines:
- buildjs, which minifies whatever comes into it using gulp-uglify, while creating sourcemaps only if env.sourcemaps is true (thanks to gulp-if conditions). Remember is used to allow fast rebuilds; see the documentation below.
- js, which accepts source files with .src(), then pipes them into buildjs, then pipes the result into gulp-concat to join the files' contents, then writes the result to the build directory.
The glou.task() method is similar to gulp.task, but it takes a pipeline as its second argument instead of a stream.
Finally, the watcher watches the js files beneath src, and runs the changed files through the js task.
Introduction
This little example introduces the main methods used by glou.
In reality, they are much more powerful, and you should definitely check out their documentation in order to write more advanced build systems.
A bigger example, building a complete demo-app using AngularJS, Bootstrap, Stylus, and a Node back-end server using Express, Jade and i18next, is available in this repository.
Installation
First you will need gulp:
npm install -g gulp
You need to use the gulp executable to run glou tasks. There is no glou executable currently (this is on the roadmap for v2).
Then install glou:
npm install glou
Note: you need to use gulp.task() to register a default task or create aliases for tasks.
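For instance, a minimal sketch of registering a default task and an alias with gulp 3's dependency syntax (the task names are illustrative and assumed to be registered elsewhere with glou.task()):
var gulp = require('gulp');
// run the glou tasks 'js' and 'css' when gulp is invoked with no arguments
gulp.task('default', ['js', 'css']);
// 'build' is a simple alias for the same tasks
gulp.task('build', ['js', 'css']);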
Changelog
You can access the CHANGELOG.md to know what changed recently.
Documentation
Pipelines
You can see a pipeline as a big meta stream. It is a function you can call that always returns a transform stream (meaning you can pipe into it and pipe it into something else).
You can create a more complex pipeline from an existing one by calling any of its pipelining methods (configure, src, pipe, parallel, dest, remember, append and prepend).
To create a new pipeline, you just have to call any pipelining method directly on the glou object.
// a new pipeline:
var sourcesPipeline = glou.src('*.js');
// another new pipeline:
var destPipeline = glou.dest('build');
// another new pipeline, that uses the previous ones:
var jsPipeline = sourcesPipeline.pipe(destPipeline);
// a pipeline may have any length:
var pipeline = sourcesPipeline
.pipe(gulpJshint)
.pipe(gulpJscs)
// […]
.pipe(gulpUglify)
.pipe(gulpConcat)
;
Note: Because pipelines are meta streams, you never give them streams directly but functions that return a stream instead. This allows glou to instantiate the needed streams multiple times whenever it is required (glou knows ^_^).
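To make the distinction concrete, here is a minimal sketch (gulp-uglify stands in for any gulp plugin):
var gulpUglify = require('gulp-uglify');
// Wrong: gulpUglify() is called immediately and yields a single stream instance
// glou.pipe(gulpUglify());
// Right: pass the function itself, so glou can call it each time a stream is needed
glou.pipe(gulpUglify);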
The methods used in this example (src, dest and pipe) will be explained further on.
Advanced usage: A pipeline can simply be converted into a stream by calling it as a function:
var pipeline = glou
.pipe(TransformStreamFn)
.pipe(TransformStreamFn2)
;
var duplexStream = pipeline(); // that was easy!
This function can take an options object that will be used to configure the pipeline (see .configure() below for more information):
pipeline({option: 'value'})
Source: .src([options,] sources)
A pipeline may start with a .src() call in order to add sources flowing down.
However, you may call .src() anywhere you want in a pipeline if you want to add sources at a particular step.
By default, the new sources are appended to the stream.
sources
Type: string | array<string> | function
The source files to load and add to the stream.
If a function is provided, it will be called at each pipeline execution and the returned value will be used as sources. It should return a string or an array<string>.
options
Type: object (optional)
This method takes an optional first argument: options. It is an object provided to gulp-src, and internally, to node-glob and glob-stream.
For more information, please refer to gulp-src's documentation.
Additional values that are used by glou:
options.prepend:
Type: boolean (optional)
Prepend the files in the stream, instead of appending them.
options.onChange:
Type: string (optional)
On partial rebuild, src does not behave the same way as for a full build. It only emits the created/changed files (but not the deleted files) if they match the glob pattern. This can be customized using the following values:
- all: all files are reemitted (as if it was a full rebuild)
- none: no files are reemitted
- allIfNothing: all files are reemitted if the changed files are not in the provided glob patterns (useful to emit all tests if a src file is changed, for example)
- changed (default value): only the files that changed are reemitted if they are in the provided glob patterns; otherwise nothing is emitted
var pipeline = glou.src(
{base: 'client'},
['app/**/*.js', 'lib/**/*.js']
);
var otherPipeline = glou
.src('app/**/*.js') // add application's sources
.pipe('jshint', gulpJshint) // lint them
// add library sources to the stream before the app sources
.src({prepend: true}, 'lib/**/*.js')
.pipe('concat', gulpConcat) // concat everything
;
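The onChange option pairs well with watchers. A hedged sketch (gulp-mocha and the globs are illustrative) that reruns the whole test suite when a watched non-test file changes:
var gulpMocha = require('gulp-mocha');
var tests = glou
  // if the changed files are not test files, reemit every test file instead
  .src({onChange: 'allIfNothing'}, 'test/**/*.spec.js')
  .pipe('mocha', gulpMocha)
;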
Note: Please do not use gulp-add-src; use .src() instead, as glou's src() method is what makes the partial build after watch and the prepend option possible.
options.failOnNoMatch
Type: boolean (optional)
Defaults to true. Make the build fail if the provided glob pattern does not match anything and the build is not incremental. Set to false to disable this behaviour.
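A quick sketch of disabling this check for an optional source directory (the glob is illustrative):
var optionalPlugins = glou
  // an empty plugins/ directory should not break the build
  .src({failOnNoMatch: false}, 'plugins/**/*.js')
  .dest('build/plugins')
;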
Dest: .dest([options,] path)
A pipeline may end with a .dest() call in order to write the streaming files to the filesystem.
However, you may call .dest() anywhere you want in a pipeline if you want to write the files at a particular step: like gulp's .dest(), it re-emits all data passed to it so you can continue the pipeline further on.
path
Type: string | function
The path (output folder) to write files to, or a function that returns a file path for each file. In that case, the function will be provided a vinyl File instance each time a file goes through the dest stream.
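For example, a minimal sketch of a function path that routes files by extension (the output folders are illustrative):
var path = require('path');
var routedDest = glou
  .src('src/**/*')
  .dest(function(file) {
    // file is a vinyl File instance
    return path.extname(file.path) === '.css' ? 'build/styles' : 'build/assets';
  })
;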
options
Type: object (optional)
This method takes an optional first argument: options. It is an object provided to gulp-dest.
For more information, please refer to gulp-dest's documentation.
glou
.src('*.js') // takes all js files in the current directory
.dest('build') // copy them in the "build" directory
;
var backupPath = '/www/backups';
glou
.dest('app') // outputs in app
.dest({cwd: backupPath}, 'app') // & outputs in /www/backups/app
;
Using .dest(path) is very similar to using .pipe(gulp.dest, path), but please use this method as we may enhance its behavior in the future (plus it is shorter to write!).
.dest(path) takes glou.watch() into account: when a file is deleted, transformed files are deleted too. Example:
glou
.task('indexHtml', glou
.src('index.html')
// If you delete index.html, it will be deleted from app.
.dest('app')
)
.watch('index.html', 'indexHtml')
;
This feature works when the file name is transformed (e.g. index.jade → index.html), or when the file is moved around. If you use .dest(path) in a generic pipeline (with a function as the path), all corresponding generated files will be removed, regardless of the task that has been run. If you want finer control, you will need to use gulp.dest.
options.log
Type: boolean (optional)
Defaults to true. If set to false, suppresses logs about written files.
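For instance, a quick sketch of a silent copy step:
glou
  .src('assets/**/*')
  // copy files without logging each written file
  .dest({log: false}, 'build/assets')
;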
Pipe: .pipe([streamName,] [options,] streamFn, [args...])
.pipe() is the heart of glou's pipelines.
It takes a stream-returning function streamFn (including pipelines, since they are functions returning streams). The stream produced above in the pipeline will be piped into the stream returned by streamFn during instantiation.
streamFn (or pipeline)
Type: function
A function that returns a stream. You usually want to use a gulp plugin here.
You may also want to craft a transform yourself using through2, but you would usually prefer to export this function to another file, module or package in order to keep things modular.
var through = require('through2');
glou.pipe('remove sourcemaps', function() {
return through.obj(function(file, enc, cb) {
file.contents = new Buffer(
// sourcemaps are on one line and begin with //#
file.contents.toString(enc).replace(/^\/\/#.*?$/gm, ''),
enc
);
cb(null, file);
});
});
streamName
Type: string (optional)
The pipe's name (helpful for debug output and to visually see what's going on in this pipe).
options
Type: object (optional)
options.fn: By setting options.fn to false, functions given as parameters to pipe won't be auto-invoked anymore (see args below).
options.array: By setting options.array to false, arrays with a function in their first cell won't be invoked (see args below).
options.obj: By setting options.obj to false, glou.config calls won't be resolved in objects, offering better performance (see args below).
options.error: By setting options.error to 'warn', you can raise warnings instead of errors.
Note: Warning mode explicitly extracts this pipe from the pipeline, so that it won't be plugged into a .pipe beneath.
Let's explain why: if an error is raised or emitted during your stream's execution, it's impossible to continue the flow as the result may be completely incoherent. For example, maybe you really needed the file that failed later on, after concatenating the stream's flow…
So, if an error occurs, it is properly caught by glou, which crashes the running task with a nice error message.
However, sometimes you may need to use plugins that do not alter your files, and for those, allowing errors to be caught and emitted as warnings would be great!
A fitting example is gulp-jscs. JSCS is a code style checker that checks how developers write code to ensure that everyone writes under the same norm on a project (a very useful plugin indeed!).
However, having the build fail while working because an inappropriate space character is left at the wrong place is a bit annoying. This is where you want to use the {error: 'warn'} option:
var argv = require('yargs').argv;
glou
.pipe('jshint', gulpJshint)
.pipe('jscs', {error: argv.dev ? 'warn' : 'fail'}, gulpJscs)
.pipe(jsBuild)
;
Here, in dev mode only, jshint will be piped directly into both jscs and jsBuild. Errors happening in the jscs orphan branch (meaning its output will be discarded) will be emitted as warnings at the end of the execution of the task, thus effectively preventing jscs from making the build fail.
args...
Type: any (optional)
This is advanced usage, use with extreme care.
You often need to provide your stream functions or plugins with arguments. In this case, you would normally be forced to wrap your plugin's call in a function:
// pass gulpSourcemaps.init function without arguments:
glou.pipe('sourcemaps init', gulpSourcemaps.init);
// pass a function that calls gulpSourcemaps.init with arguments:
glou.pipe('sourcemaps init', function() {
return gulpSourcemaps.init({loadMaps: true});
})
In order to keep things convenient and concise, the args... list is passed to the function:
glou.pipe('sourcemaps init', gulpSourcemaps.init, {loadMaps: true});
But what if I need to execute a function as an argument? All functions are invoked beforehand, so:
glou.pipe('sourcemaps init', gulpIf, env.sourcemaps, gulpSourcemaps.init);
// is the same as
glou.pipe('sourcemaps init', function() {
return gulpIf(env.sourcemaps, gulpSourcemaps.init());
})
And what if I also want to call gulpSourcemaps with an argument (like {loadMaps: true})? You can do that by creating partial calls using arrays:
glou.pipe('sourcemaps init', gulpIf, env.sourcemaps,
[gulpSourcemaps.init, {loadMaps: true}]);
// this is the same as
glou.pipe('sourcemaps init', function() {
return gulpIf(env.sourcemaps, gulpSourcemaps.init({loadMaps: true}));
})
You can wrap as many arrays/functions/… in each other as you want:
glou.pipe('sourcemaps init', gulpIf, env.sourcemaps,
[gulpSourcemaps.init, [_.constant({loadMaps: true})]]);
Nice! But what if I wanted to pass an array or a function as an argument? Well, the case is extremely rare as plugins usually take object arguments. Since all arguments that are functions are immediately executed to retrieve their result as an argument, you can wrap your arrays (or functions) using a constant call, or you can use the fn, array & obj options:
function constant(val) {
return function() {
return val;
};
} // if you are using lodash, there is _.constant doing exactly that
glou.pipe('a plugin', gulpPlugin,
constant([function() {}, 'one', 'two']),
constant(myFunc)
);
// or with options:
glou.pipe('a plugin', {fn: false, array: false}, gulpPlugin,
[function() {}, 'one', 'two'],
myFunc
)
Note: Please keep in mind that these features are nice-to-haves provided to keep things as concise as possible.
If you stumble upon a complex plugin call, it is better to simply enclose it in a function (see streamFn):
glou.pipe('my complex plugin', function() {
return myComplexPlugin({
val: myComplexPlugin.helper('toto'),
// …
});
});
Note: Since pipelines can be instantiated using an options object (see configure & config below), you can also pass it using the args notation:
glou.pipe('my pipeline', pipeline, pipelineOptions);
Multiplexer: .parallel([streamName,] [pipelines... | pipelineFn]), .serie([streamName,] pipelines...)
In a complex build system, you often want to merge streams. This is especially true with glou, with which you can create reusable pipelines. The component to achieve that is called a multiplexer.
The difference between parallel and serie is the parallelization of the pipelines' execution.
streamName
Type: string (optional)
The multiplexer's name (helpful for debug output and to visually see what's going on here).
pipelines...
Type: streamFn (may also be an array<streamFn> in .parallel)
The multiplexer takes several streamFns (or pipelines, as usual), and outputs a single one.
If you pipe a pipeline into a multiplexer, it is piped into all of its inputs.
The results of the given pipelines can be merged in order (when passed as successive arguments) or out of order (when passed as an array).
var jsClientLib = glou; // fake pipeline building libraries files
var jsClientApp = glou; // fake pipeline building app files
function jsClientFile(type) {
// we want the libraries to appear BEFORE the app files,
// so we execute parallel with a list of arguments
return glou
.parallel(
glou
.src(config.js.client.lib[type])
.pipe('jsClientLib', jsClientLib),
glou
.src(config.js.client.app[type])
.pipe('jsClientApp', jsClientApp)
)
.pipe('concat', gulpConcat, type + '.min.js')
.dest('build/app/')
;
}
// we can execute head and body in parallel without order,
// as each of them output their own dest file
var jsClient = glou.parallel([jsClientFile('head'), jsClientFile('body')]);
// we want to execute jsClient entirely before starting the
// jsServer pipeline, so we use `.serie`
var app = glou.serie(jsClient, jsServer);
Note: Merging in order may be slower (files flowing from a second stream would have to wait for the first in order to continue flowing down).
Note: Obviously, .serie always merges in order.
When merging in parallel, you get better performance but the order of the files in the stream is random, while merging in series lowers performance but ensures that files from the first pipeline will be merged before files of the second.
Note: .serie buffers everything that flows from the pipeline in order to pipe everything into the provided pipelines. This may lead to freezes if you stream too many files inside a serie, as one of the streams may pause. In this case, you should review your flow as it may not be ideal:
var pipelineProducingALotOfFiles = glou /* … */;
// Here all files will try to be loaded in memory while they are piped into
// checkFiles. Once checkFiles is done, assembleFiles will be instantiated
// and all files in memory will also be piped into assembleFiles' instance.
// Once assembleFiles is done, uploadFiles will be instantiated & all files
// piped again into uploadFiles' instance…
var mightFreeze = pipelineProducingALotOfFiles
.serie(checkFiles, assembleFiles, uploadFiles)
;
pipelineFn
You can't pass a single pipeline to parallel (what would that mean?). So if you pass a single function, it is assumed to be a pipelineFn. A pipelineFn should return the pipelines when invoked. It takes the config as this.
Here is a simple example:
var dynamicParallelCall = glou.parallel('this is dynamic!', function() {
var tasks = [/* copy task */];
if (this.config('reduceTask'))
tasks.push(/* reduce task */);
return tasks;
});
prepend / append: .append([streamName,] [options,] pipeline[, …args])
Those methods prepend/append the result of another pipeline before/after the files of the current one.
var myPipeline = glou
.pipe(someProcess)
.append(someProcess2)
;
Arguments of prepend/append are similar to pipe arguments (see the related documentation) and are used to initialize the provided pipeline:
var myPipeline = glou
.pipe(someProcess)
.append(pipeOptions,
someProcess2,
someProcess2Arg1 /* , ... */
)
;
The previous code is similar to (this code is hard to understand!):
var myPipeline = glou
.pipe(someProcess)
.parallel(
$.noop,
glou.plugins.swallow() // swallow eats files without emitting them
.pipe(pipeOptions, someProcess2, someProcess2Arg1 /* ... */)
)
;
Note: The pipeline provided will execute in parallel with the one in which append or prepend is called. The only difference is the order of the files after the step.
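As an illustration, a hedged sketch that uses prepend to put vendor files before the application files (the globs are illustrative):
var gulpConcat = require('gulp-concat');
var withVendors = glou
  .src('src/**/*.js')
  // the vendor files end up before the app files in the resulting stream
  .prepend(glou.src('vendor/**/*.js'))
  .pipe('concat', gulpConcat, 'app.js')
;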
Configure: .configure(config)
As we have seen before, when you instantiate a pipeline you can give it a configuration object (aka options). If you don't provide one, an "empty" object is used instead (some internals may be attached to it anyway).
The configuration flows down into sub-pipelines automatically. What we call sub-pipelines here are both subcalls like currentPipeline.dest(…) (here, dest will get the configuration of currentPipeline) and pipelines invoked using pipe, like currentPipeline.pipe(pipeline) (pipeline will also be initialized with the configuration of currentPipeline).
Calling configure merges the provided config with the current pipeline's configuration.
config
Type: object | function
The configuration object to merge into the current pipeline's configuration.
If it is a function, it is expected to return the configuration object to merge into the current one.
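For instance, a minimal sketch where the configuration is computed lazily (the key is illustrative):
glou
  .configure(function() {
    // evaluated when the pipeline is instantiated
    return {buildDate: new Date().toISOString()};
  })
  .src('src/**/*.js')
;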
this in callbacks
The configuration object is made accessible as this in all callbacks of glou.
var writeResult = glou
.pipe(gulpConcat, function() {
return this.filename;
})
.dest('build/')
;
glou
.configure({filename: 'test.js'})
.src('src/*.js')
.pipe(writeResult)
;
this.config(key [,defaultValue]) in callbacks
The configuration object has a config convenience getter that checks for the provided key in the object and throws if it does not exist. You can pass a second argument with a default value to get it instead of throwing:
var writeResult = glou
.configure({filename: 'test.js'})
.pipe(gulpConcat, function() {
return this.config('filename');
})
.pipe(function() {
// throws because 'sub.key' does not exist
// note: it won't throw if 'sub.key' is defined in an upper pipeline
return myPlugin(this.config('sub.key'))
})
.pipe(function() {
// returns 'default' because 'sub.key' does not exist
return myPlugin(this.config('sub.key', 'default'))
})
;
key
Type: string (optional)
The key of the value to retrieve in the configuration object.
Advanced usage: If you don't specify the key, the complete configuration object will be returned. This can be useful if you want to pass such data into a plugin.
defaultValue
Type: string (optional)
Sometimes it is useful to have a default value. So, like this.config() in streamFns, glou.config also takes a default value working the same way.
Config: .config(key [,defaultValue])
Config is a helper that saves you from creating functions in a lot of situations.
var jsClientLib = glou
.pipe('a plugin', gulpPlugin, glou.config('key'))
;
// this is the equivalent of
var jsClientLib = glou
.pipe('a plugin', gulpPlugin, function() {
return this.config('key');
})
;
// which is also equivalent to
var jsClientLib = glou
.pipe('a plugin', function() {
return gulpPlugin(this.config('key'));
})
;
Arguments are the equivalents of those of this.config(key [,defaultValue]).
In this example, we transform Jade files into HTML templates, and we pass the entire configuration object to jade's globals while concat only gets the filename.
var templates = glou
.src(config.templates)
.pipe('jade', gulpJade, glou.config())
.pipe('concat', gulpConcat, glou.config('filename'))
;
// this is the same pipeline
var templates = glou
.src(config.templates)
.pipe('jade', gulpJade, function() {
return this;
})
.pipe('concat', gulpConcat, function() {
return this.config('filename');
})
;
Even better, in .pipe() calls, object keys and subkeys can contain a glou.config call:
var templates = glou
.src(config.templates)
.pipe('jade', gulpJade, {pipelineConfig: glou.config(/* … */)})
;
Task: .task(name, pipeline)
A glou task internally creates a gulp task, so you can call it from the command line:
$ gulp jsClient
A task in glou is very similar to a task in gulp, except it handles the proper start of a pipeline.
Indeed, a glou pipeline is only composed of transform streams, as opposed to gulp, which needs to start with a call to gulp.src(), a readable stream.
This method correctly plugs in the embedded error handling system, so please use it.
var jsClient = glou; // FIXME: create pipeline
glou
.task('jsClient', jsClient)
;
name
Type: string
The task name.
pipeline
Type: pipeline
The pipeline to run.
Run: .run(pipeline)
This is a proper function to start a pipeline without gulp.
It configures the pipeline properly to handle errors. This will probably be deprecated in the future, but right now it is mandatory.
glou.run(glou.src('*.js').dest('build'));
pipeline
The pipeline to run.
Watch: .watch(sources, tasks)
Glou embeds watchers in order to rebuild your app as soon as you save a file in your editor.
One particularity of glou's watchers is that they restart a pipeline that will only run with the changed files. This capability, combined with .remember() (documented further down), allows you to update your build in a matter of milliseconds.
glou
.task('jsClient', jsClient)
.watch(['app/**/*.js', 'app/templates/**/*.tpl.html'], 'jsClient')
;
sources
Type: string or array<string>
The path globs of the files to watch.
tasks
Type: string or array<string>
The tasks to start upon change. All the pipelines launched by these tasks will have their .src() emit nothing except the modified files.
Remember: .remember([name])
Remember is one of the most powerful tools of glou. It is MANDATORY (at least in most cases) to add calls to remember() at the appropriate places in glou for watch to work properly.
Indeed, as discussed above, watch restarts the pipeline with only the changed files being emitted by the different sources. This means that you need to be able to reassemble all your files at some point. Remember allows you to do just that.
Here is what remember does:
# first pass
fileA -> remember
fileB -> remember
[end event] -> remember
remember -> fileA, fileB
# second pass
fileB (modified) -> remember
[end event] -> remember
remember -> fileA, fileB (modified)
# third pass
fileB (modified 2x) -> remember
fileA (modified) -> remember
[end event] -> remember
remember -> fileA (modified), fileB (modified 2x)
As you can see, remember "remembers" everything it sees passing through and reemits everything it once saw (even in the past) in the same order. The modified files it receives are updated in its cache before being reemitted at the end, at the "correct" place.
Note: Because new files will always be seen last, they will also be emitted last. This might not be a correct behavior for your use case. You can use gulp-order right after your remember call to get rid of this problem.
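A hedged sketch of pairing remember with gulp-order (the glob order is illustrative):
var gulpOrder = require('gulp-order');
var gulpConcat = require('gulp-concat');
var assembled = glou
  .remember()
  // force library files before application files, whatever order they were first seen in
  .pipe('order', function() {
    return gulpOrder(['lib/**/*.js', 'src/**/*.js']);
  })
  .pipe('concat', gulpConcat, 'app.js')
;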
glou takes care of making remember forget about deleted files automatically when a delete event is received. The resulting behaviour is:
# first pass
fileA -> remember
fileB -> remember
[end event] -> remember
remember -> fileA, fileB
# second pass (fileA deleted)
[glou asks remember to forget about fileA]
[end event] -> remember
remember -> fileB
Note: glou uses gulp-remember-history internally. Do not use it (or gulp-remember) directly, since it won't be "glou aware".
name
Type: string | function (optional)
The name allows you to select a cache for remember to work in. By default, each call to remember() creates a unique cache, which is fine. But sometimes you create a pipeline in which remember is generic and will be used by multiple files that are not meant to be used together. In this particular case, you will have to specify a name to make it work:
var assemble = glou
.pipe('jscs', gulpJscs)
.pipe('uglify', gulpUglify)
.remember(glou.config('filename')) // the filename is used as cache name
.pipe('concat', gulpConcat, glou.config('filename'))
.dest('build/')
;
glou
.configure({filename: 'lib1.min.js'})
.src(config.lib1.src)
.pipe(assemble)
;
glou
.configure({filename: 'lib2.min.js'})
.src(config.lib2.src)
.pipe(assemble)
;
The previous example would quickly cause lib1 and lib2 files to be sent both to lib1.min.js and to lib2.min.js if remember was using a single cache. Here, we asked to put the files flowing through each stream in a different cache, depending on where the files are going to be output.
Advanced usage: Using a named cache can also allow you to purposefully share a cache between two pipelines (a real use case for this eludes me):
var task1 = glou
.pipe(…)
.remember('task1.step')
.pipe(…)
;
var task2 = glou
.remember('task1.step')
// here I get every file seen by task1 at the marked step, so it acts a
// little like src but with another pipeline's files
.pipe(…)
;
Performance
We measured that the total overhead of glou is <10ms on a complex build (the angular.app example available in the repository).
Since the library merely helps create pipelines, once the streams are plugged together, no overhead remains at all.
Tests
The library is currently covered by more than 200 unit tests that reach 100% code coverage (reported by Istanbul).
Library and test code also pass JSHint and JSCS.
Contribution
You are welcome to create pull requests if you want to correct a bug or if you developed a nice new little feature.
However, prior to submission, we ask you to:
- Write new tests for any change
- Run and pass the test suite (gulp test --coverage). Code coverage must stay at 100%.
- Update the documentation accordingly
- Respect the commit message format (see the revisions graph to get an idea)
Thank you!
License
This package is licensed under the terms of the MIT license.