# YADBF (Yet Another DBF parser)
This project is a streaming DBF parser that throws errors for the slightest transgressions.
It was originally written to support the `.dbf` files found in Esri shapefiles for the OpenAddresses submit-service, but it may work for your `.dbf` file.
## Requirements
Node.js 8 or higher is required.
## Installation

Using yarn:

```sh
$ yarn add yadbf
```
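Or, since yadbf is published to the npm registry, the standard npm install works as well:

```sh
$ npm install yadbf
```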
In Node.js:

```javascript
const YADBF = require('yadbf');
const fs = require('fs');

fs.createReadStream('file.dbf')
  .pipe(new YADBF())
  .on('header', header => {
    console.log(`header: ${JSON.stringify(header, null, 2)}`);
  })
  .on('data', record => {
    console.log(`record: ${JSON.stringify(record, null, 2)}`);
  })
  .on('end', () => {
    console.log('Done!');
  })
  .on('error', err => {
    console.error(`an error was thrown: ${err}`);
  });
```
## Options

The following options are available and can be passed to the constructor in a single object parameter:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `deleted` | boolean | whether records flagged as deleted should be returned; a non-boolean value is treated as "not supplied" | `false` |
| `offset` | integer | number of records to process before emitting | `0` |
| `size` | integer | number of records to emit | `Infinity` |
| `encoding` | string | any encoding supported by iconv-lite | `utf-8` |
`offset` and `size` are implemented to support pagination. An error is thrown if any option value is not of the supported type.

Using the pagination functionality destroys the stream after the records within the requested page have been pushed. For example, if `offset=20` and `size=10` are supplied in options, then YADBF will stop processing immediately after the 30th record.
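As a minimal sketch of that pagination example (the file name and values are illustrative; the option names come from the table above):

```javascript
const YADBF = require('yadbf');
const fs = require('fs');

// Skip the first 20 records, emit the next 10, then the stream is destroyed.
fs.createReadStream('file.dbf')
  .pipe(new YADBF({ offset: 20, size: 10, encoding: 'utf-8' }))
  .on('data', record => console.log(record))
  .on('error', err => console.error(`error: ${err}`));
```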
## Notes

Deleted records do not affect the operation of the `offset` and `size` options. That is, if the entire `.dbf` contains 2 records, deleted and not deleted, respectively, then `offset` and `size` both set to `1` would output the second record.
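A minimal sketch of that scenario, assuming a hypothetical two-record file `two-records.dbf` whose first record is flagged as deleted:

```javascript
const YADBF = require('yadbf');
const fs = require('fs');

// offset and size count deleted records too, so skipping 1 record skips the
// deleted first record and only the second (non-deleted) record is emitted.
fs.createReadStream('two-records.dbf')
  .pipe(new YADBF({ offset: 1, size: 1 }))
  .on('data', record => console.log('second record:', record));
```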