Safe JSON stream
This stream module differs from others in that it prevents DoS attacks by checking, on the fly, the size of JSON elements and their subelements. By doing this we avoid the following:
a) the out-of-memory problem
b) the blocking-for-too-long problem
This module is intended for streaming a top-level JSON array, whose elements can be other arrays, objects, strings, numbers and so on.
Example of valid data:

```json
[["aa"], ["b"], {"test": ["n1","n2", 1000]}, ["c"], {"test2":"xxxxxxxxxxx"}, ["d"], ["e"], ["f"], ["g"]]
```
Elements are checked against the size limit one by one, as they arrive in incoming packets:

```
["aa"]                        // length of this element is 6 bytes
["b"]
{"test": ["n1","n2", 1000]}   // length of this element and subelements together is 27 bytes
["c"]
{"test2":"xxxxxxxxxxx"}
...
```
Note: if we set the element_size_limit parameter to 20 bytes, then the third element (27 bytes) will cause an error to be thrown from the stream.
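For reference, these byte counts are just the UTF-8 lengths of each element as it appears in the input; a quick sketch using Node's built-in Buffer (independent of this module):

```js
// UTF-8 byte lengths of the serialized elements quoted above
console.log(Buffer.byteLength('["aa"]', 'utf8'));                       // 6
console.log(Buffer.byteLength('{"test": ["n1","n2", 1000]}', 'utf8')); // 27
```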
Donate
My income consists of donations for my projects. If this module is useful to you, consider making a donation!
You can donate using PayPal or wire transfer.
Note
This module is updated frequently, and newly found errors/bugs will be fixed as soon as they are discovered. This version differs from the previous one in that JSON elements are returned from the stream as strings. This is for reasons of speed. If you need a JSON dictionary, just use JSON.parse on the returned strings. In the near future I am planning to create an even faster version of Safe JSON stream, implemented as a Node.js C++ addon.
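For example, assuming a stream instance like the one in the Example section below, turning an emitted string back into a dictionary is a one-liner:

```js
safe_json_stream_instance.on('data', function(income_data)
{
    // income_data is the element as a string, e.g. '{"test": ["n1","n2", 1000]}'
    var element = JSON.parse(income_data); // now a plain JavaScript object/array
    console.log(element);
});
```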
Discussion
Out of memory happens in JSON streaming when a particular element in the JSON array is too large. With this module this never happens, since we can adjust the maximum number of bytes an element (and its subelements) may contain. On overflow, an error is thrown out of the stream and we can catch it.
Blocking for too long happens with some existing modules in the following situation: you have thousands of elements in a JSON array that are small strings, and one string in the middle of the array is changed to contain a lot of characters (say 10 MB). While running those streams, blocking occurs for some time while this very large element is converted from a plain string into a JSON dictionary. Again, this does not happen with my module.
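To make this scenario concrete, here is a sketch that generates such a pathological array (the file name and sizes are illustrative, not part of this module):

```js
const fs = require('fs');

// thousands of small string elements...
const elements = [];
for (let i = 0; i < 10000; i++) elements.push('"s' + i + '"');

// ...with one very large (~10 MB) string in the middle
elements[5000] = JSON.stringify('x'.repeat(10 * 1024 * 1024));

fs.writeFileSync(__dirname + '/huge_test.json', '[' + elements.join(',') + ']');
// Streaming this file with a small element_size_limit throws on the oversized
// element instead of buffering it whole or parsing it in one blocking step.
```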
Example
```js
const fs = require('fs');
const { pipeline } = require('stream/promises');
const safe_json_stream = require('safe_json_stream'); // assumed require name, matching the package name

async function perform_stream()
{
    var readFile = fs.createReadStream(__dirname + '/test.json',
    {
        flags: 'r',
        highWaterMark: 11 * 1024, // read the file in 11 KB chunks
        objectMode: false,
    });
    readFile.setEncoding('utf8');

    // limit element and its subcontent size to 8 bytes
    var safe_json_stream_instance = new safe_json_stream({ element_size_limit: 8 });

    safe_json_stream_instance.on('data', function(income_data) // data comes in the form of a string variable
    {
        // if we need the JSON dictionary form of the incoming data, we use JSON.parse
        console.log('Element passed: ' + income_data);
    });

    try
    {
        await pipeline(
            readFile,
            safe_json_stream_instance
        );
    }
    catch (e)
    {
        console.log('Error: ' + e.toString());
        return;
    }
    console.log('JSON safe stream is over!');
}
```
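Assuming test.json contains the sample array from the beginning of this readme, with element_size_limit set to 8 bytes the first two elements (["aa"] and ["b"]) should pass, and the 27-byte third element should make the pipeline reject, landing in the catch branch. A minimal invocation:

```js
perform_stream();
```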