
dat-core

The core implementation of dat

npm install dat-core


Usage

var dat = require('dat-core')

var db = dat('./test')

db.put('hello', 'world', function (err) { // insert value
  if (err) return handle(err) // something went wrong
  db.get('hello', function (err, result) {
    if (err) return handle(err) // something went wrong
    console.log(result)   // prints result
    console.log(db.head) // the 'head' of the database graph (a hash)
  })
})

API

db = dat(pathOrLevelDb, [options])

Create a new dat instance.

Options
  • checkout - database version to access. default is latest
  • valueEncoding - 'json' | 'binary' | 'utf-8' or a custom encoder instance
  • createIfMissing - true or false, default false. creates the dat folder if it doesn't exist
  • backend - a leveldown compatible constructor to use (default is require('leveldown'))
  • blobs - an abstract-blob-store compatible instance to use (default is content-addressable-blob-store)
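For example, a minimal sketch that opens a dat with a couple of these options (the path and option values here are just for illustration):

var dat = require('dat-core')

// create the dat folder if it is missing and store values as JSON
var db = dat('./data', {
  createIfMissing: true,
  valueEncoding: 'json'
})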

By default the path passed to the backend is {path}/.dat/db. If your custom backend requires a special url, simply wrap it in a function:

var sqldown = require('sqldown')
var db = dat('/some/regular/path', {
  backend: function () {
    return sqldown('pg://localhost/database')
  }
})

db.head

String property containing the current head revision of the dat. Every time you mutate the dat this head changes.

db.init([cb])

Initializes the dat by adding a root node to the graph if one hasn't been added already. It is called implicitly when you do a mutating operation.

cb (if specified) will be called with one argument, (error)

db.put(key, value, [opts], [cb])

Insert a value into the dat

cb (if specified) will be called with one argument, (error)

Options
  • dataset - the dataset to use
  • valueEncoding - an encoder instance to use to encode the value
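A sketch of a put into a named dataset (the 'users' dataset here is hypothetical):

db.put('user-1', 'alice', { dataset: 'users' }, function (err) {
  if (err) return handle(err)
  console.log(db.head) // the head changes after every mutation
})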

db.get(key, [options], cb)

Get a value node from the dat

cb will be called with two arguments, (error, value). If successful, value will have these keys:

{
  content:  // 'file' or 'row'
  type:     // 'put' or 'del'
  version:  // version hash
  change:   // internal change number
  key:      // row key
  value:    // row value
}
Options
  • dataset - the dataset to use
  • valueEncoding - an encoder instance to use to decode the value
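A sketch of reading that row back and inspecting the returned node (the key and dataset name are carried over from the put example above):

db.get('user-1', { dataset: 'users' }, function (err, node) {
  if (err) return handle(err)
  console.log(node.type)    // 'put'
  console.log(node.key)     // 'user-1'
  console.log(node.value)   // 'alice'
  console.log(node.version) // version hash for this node
})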

db.del(key, [cb])

Delete a node from the dat by key

cb (if specified) will be called with one argument, (error)

db.listDatasets(cb)

Returns a list of the datasets currently in use in this checkout

cb will be called with two arguments, (error, datasets) where datasets is an array of strings (dataset names)
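For example:

db.listDatasets(function (err, datasets) {
  if (err) return handle(err)
  console.log(datasets) // an array of dataset names, e.g. ['users']
})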

set = db.dataset(name)

Returns a namespaced dataset (similar to a sublevel in leveldb). If you just use db.put and db.get they will use the default dataset (equivalent of doing db.dataset()).
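A sketch of namespacing, assuming the returned set exposes the same put/get methods as the dat itself:

var users = db.dataset('users')

// reads and writes on the namespaced set don't touch the default dataset
users.put('user-1', 'alice', function (err) {
  if (err) return handle(err)
  users.get('user-1', function (err, node) {
    if (err) return handle(err)
    console.log(node.value) // 'alice'
  })
})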

stream = db.createReadStream([options])

Stream out values of the dat. Returns a readable stream.
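A minimal sketch; each data event is assumed to be a value node like the one db.get returns:

db.createReadStream()
  .on('data', function (node) {
    console.log(node.key, node.value)
  })
  .on('end', function () {
    console.log('no more values')
  })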

stream = db.createWriteStream([options])

Stream in values to the dat. Returns a writable stream.

Options
  • dataset - the dataset to store the data in
  • message - a human readable message string to store with the metadata for the changes made by the write stream
  • transaction - boolean, default false. if true everything written to the write stream will be stored as one transaction in the history
  • batchSize - default 128, the group size used for the underlying leveldown batch writes. this also determines how many nodes end up in the graph (higher batch size = fewer nodes)
  • valueEncoding - override the value encoding set on the dat-core instance

Data format

When you write data to the write stream, it must look like this:

{
  type:     // 'put' or 'del'
  key:      // key
  value:    // value
}
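A sketch of streaming a few rows in, with a hypothetical commit message:

var ws = db.createWriteStream({ message: 'import users' })

ws.on('finish', function () {
  console.log('rows written, new head:', db.head)
})

ws.write({ type: 'put', key: 'user-1', value: 'alice' })
ws.write({ type: 'put', key: 'user-2', value: 'bob' })
ws.end()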

stream = db.createFileReadStream(key, [options])

Read a file stored under the key specified. Returns a binary read stream.

stream = db.createFileWriteStream(key, [options])

Write a file to be stored under the key specified. Returns a binary write stream.
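A sketch of storing a local file under a key and reading it back out (fs and the file names are just for illustration):

var fs = require('fs')

fs.createReadStream('./photo.png')
  .pipe(db.createFileWriteStream('photo.png'))
  .on('finish', function () {
    db.createFileReadStream('photo.png').pipe(fs.createWriteStream('./copy.png'))
  })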

stream = db.createPushStream([options])

Create a replication stream that pushes changes to another dat

stream = db.createPullStream([options])

Create a replication stream that pulls changes from another dat

stream = db.createReplicationStream([options])

Create a replication stream that both pulls and pushes
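A sketch of replicating two local dats in the same process, assuming the replication streams are duplex and can be piped into each other:

var a = dat('./dat-a')
var b = dat('./dat-b')

var streamA = a.createReplicationStream()
var streamB = b.createReplicationStream()

// wire the two ends together; each side both pulls and pushes
streamA.pipe(streamB).pipe(streamA)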

stream = db.createChangesStream([options])

Get a stream of changes happening to the dat. These changes are ONLY guaranteed to be ordered locally.
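A sketch of tailing the changes feed; the shape of each change object isn't specified above, so it is just logged as-is:

db.createChangesStream().on('data', function (change) {
  console.log('change:', change)
})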

stream = db.heads()

Get a stream of heads in the underlying dat graph.

stream = db.layers()

Get a stream of layers in the dat.

A layer will be added if both you and a remote make changes to the dat and you then pull the remote's changes.

They can also appear if you check out a previous revision and make changes.

stream = db.diff(branch1, branch2)

Compare two or more branches with each other. The stream will emit the key/value pairs that conflict across the branches.

stream = db.merge(branch1, branch2)

Returns a merge stream. You should write the key/value pairs that conflict across the branches to this stream (see db.diff above).

Once you end this stream the branches will be merged, assuming they no longer contain conflicting keys.
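A sketch of a manual merge, where branch1 and branch2 are two head hashes (for example from db.heads()) and each diff entry is assumed to carry key and value properties:

var diffs = db.diff(branch1, branch2)
var merger = db.merge(branch1, branch2)

diffs.on('data', function (conflict) {
  // resolve each conflict somehow; here we naively keep the emitted pair
  merger.write({ key: conflict.key, value: conflict.value })
})

diffs.on('end', function () {
  merger.end() // once ended, the branches are merged
})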

anotherDat = db.checkout(ref)

Checkout an older revision of the dat. This is useful if you want to pin your data to a point in time.

db.put('hello', 'world', function () {
  var oldHash = db.head
  db.put('hello', 'verden', function () {
    var oldDb = db.checkout(oldHash)

    oldDb.get('hello', function (err, result) {
      console.log(result) // contains 'hello' -> 'world'
    })

    db.get('hello', function (err, result) {
      console.log(result) // contains 'hello' -> 'verden'
    })
  })
})

If you want to make this checkout persistent, i.e. your default head, set the {persistent: true} option

var anotherDat = db.checkout(someHead, {persistent: true})

anotherDat.on('ready', function () {
  // someHead is your default head now if you create a new dat instance
})

To reset your persistent head to the previous one, use db.checkout(false, {persistent: true})

Custom Encoders

Wherever you can specify valueEncoding, in addition to the built-in string types you can also pass in an object with encode and decode methods.

For example, here is the implementation of the built-in JSON encoder:

var json = {
  encode: function (obj) {
    return new Buffer(JSON.stringify(obj))
  },
  decode: function (buf) {
    return JSON.parse(buf.toString())
  }
}
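An encoder like this can then be passed anywhere valueEncoding is accepted, for example when opening the dat (a sketch):

var db = dat('./data', { valueEncoding: json })

db.put('config', { retries: 3 }, function (err) {
  if (err) return handle(err) // value was stored JSON-encoded
})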

License

MIT