ijs-ipc-json-stream
Overview
This module provides a convenient way to send and receive complex data objects over streams (pipes and sockets). It does this by transparently serializing the data to JSON and parsing it on the other side, emitting a json event to your code whenever a complete JSON message has arrived.
The library handles all buffering for you, so it emits exactly one json event for each completed JSON document, pre-parsed into a data object for your callback. For sending data, you can pass it a complex object, which will be auto-serialized and streamed over the pipe or socket.
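Under the hood, each message is simply the serialized JSON document followed by an end-of-line delimiter (see "End of Lines" below). Here is a minimal sketch of the wire format, using Node's built-in PassThrough streams as stand-ins for a real pipe or socket (the exact byte output is an assumption for illustration):
var PassThrough = require('stream').PassThrough;
var JSONStream = require('ijs-ipc-json-stream');

// stand-ins for the two halves of a pipe, purely for illustration
var incoming = new PassThrough();
var outgoing = new PassThrough();
var stream = new JSONStream( incoming, outgoing );

// peek at the raw bytes the library writes
outgoing.on('data', function(chunk) {
	// one serialized JSON document per line, delimited by os.EOL,
	// e.g. {"action":"ping"} followed by a newline
	process.stdout.write( "Raw wire data: " + chunk.toString() );
} );

stream.write({ action: "ping" });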
Usage
Use npm to install the module:
npm install ijs-ipc-json-stream
Then use require() to load it in your code:
var JSONStream = require('ijs-ipc-json-stream');
To use the module, instantiate an object, and attach it to a stream:
var stream = new JSONStream( read_stream, write_stream );
Things like network sockets are both readable and writable, so you only need to pass in one argument for those:
var stream = new JSONStream( socket_handle );
You can then add a listener for the json event to receive a fully parsed JSON document, or call write() to send one. Example:
stream.on('json', function(data) {
	console.log("Got data: ", data);
} );

stream.write({ action: "something", code: 1234 });
You will always receive pre-parsed JSON as a data object, and write() handles all serialization for you as well, so you never have to call JSON.parse() or JSON.stringify() directly.
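For example, a nested structure round-trips as a plain object on the other side. A brief sketch, assuming a stream attached as above (the sample data is hypothetical):
// the receiving side gets the structure pre-parsed
stream.on('json', function(data) {
	console.log( data.user.roles[0] ); // "admin" -- no JSON.parse() needed
} );

// the sending side passes a nested object; serialization is automatic
stream.write({
	user: { name: "jhuckaby", roles: ["admin", "ops"] }
});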
Use With Child Processes
Here is a more complete example, which attaches a read/write JSON stream to a child process, sets up a read listener, and writes to the child:
var JSONStream = require('ijs-ipc-json-stream');

// spawn worker process
var child = require('child_process').spawn(
	'node', ['my-worker.js'],
	{ stdio: ['pipe', 'pipe', 'pipe'] }
);

// connect json stream to child's stdio
// (read from child.stdout, write to child.stdin)
var stream = new JSONStream( child.stdout, child.stdin );

stream.on('json', function(data) {
	// received data from child
	console.log("Got data from child: ", data);
} );

// write data to child
stream.write({
	action: 'update_user_record',
	username: 'jhuckaby',
	other: 12345
});

// close child's stdin so it can exit normally
child.stdin.end();
You can also use a JSON stream in the child process itself, to handle the other side of the pipe:
var JSONStream = require('ijs-ipc-json-stream');
var stream = new JSONStream( process.stdin, process.stdout );

stream.on('json', function(data) {
	// got data from parent, send something back
	stream.write({ code: 0, description: "Success from child" });
} );
Use With Network Sockets
You can also use JSON streams over network sockets, providing an easy way to send structured data to/from your clients and servers. For example, on the server side you could have:
var server = require('net').createServer( function(socket) {
	// new connection, attach JSON stream handler
	var stream = new JSONStream( socket );
	
	stream.on('json', function(data) {
		// got data from client
		console.log("Received data from client: ", data);
		
		// send response
		stream.write({ code: 1234, description: "We hear you" });
	} );
} );

server.listen( 3012 );
And on the client side...
var client = require('net').connect( {port: 3012}, function() {
	// connected to server, now use JSON stream to communicate
	var stream = new JSONStream( client );
	
	stream.on('json', function(data) {
		// got response back from server
		console.log("Received response from server: ", data);
	} );
	
	// send greetings
	stream.write({ code: 2345, description: "Hello from client!" });
} );
End of Lines
By default, the library assumes each JSON record will be delimited by the current operating system's end-of-line character sequence (os.EOL), which is \n on Unix/Linux/OSX. However, you can change this by setting the EOL string property on your class instance:
var stream = new JSONStream( socket_handle );
stream.EOL = "\r\n"; // DOS line endings
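If you do change the delimiter, it is safest to set the same EOL on both ends of the connection. A brief sketch, reusing the child process setup from above (the parent/child split is just for illustration):
// in the parent process:
var stream = new JSONStream( child.stdout, child.stdin );
stream.EOL = "\r\n";

// in the child process (my-worker.js):
var stream = new JSONStream( process.stdin, process.stdout );
stream.EOL = "\r\n";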
Performance Tracking
If you happen to use our ijs-perf module in your application, you can pass in a performance tracker by calling setPerf() on a JSON Stream. Example:
stream.setPerf( perf );
This will track the total JSON parse time, the JSON compose time, and the JSON payload sizes on both reads and writes. Also, if any stream write() calls happen to return false (i.e. buffered), a special json_stream_write_buffer perf counter is incremented. Here are all the performance tracking keys used:
| Perf Key | Type | Description |
|----------|------|-------------|
| json_stream_parse | Elapsed Time | Time spent parsing JSON. |
| json_stream_compose | Elapsed Time | Time spent composing JSON. |
| json_stream_bytes_read | Counter | Number of bytes read from stream. |
| json_stream_bytes_written | Counter | Number of bytes written to stream. |
| json_stream_msgs_read | Counter | Number of JSON messages read from stream. |
| json_stream_msgs_written | Counter | Number of JSON messages written to stream. |
| json_stream_write_buffer | Counter | Number of times the stream write() call returned false. |
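A minimal sketch of hooking this up, assuming ijs-perf exports a constructor (see that module's documentation for its exact API), with socket_handle as a placeholder stream as in the earlier examples:
var JSONStream = require('ijs-ipc-json-stream');
var Perf = require('ijs-perf'); // assumption: constructor-style export

var perf = new Perf();
var stream = new JSONStream( socket_handle );
stream.setPerf( perf );

// from here on, parse/compose times and byte/message counts are
// recorded under the perf keys listed in the table above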