

mongoose-unique-shard

When sharding, unique indexes are only allowed on keys matching the shard key. This module lets you declare other properties as unique as well; uniqueness for those is then enforced via an additional collection that keeps track of the unique field values.

How to use?

    var mongoose = require('mongoose');
    var Schema = mongoose.Schema;
    var mongooseUniqueShard = require('mongoose-unique-shard');

    var testSchema = new Schema({
      uniqueKey: { type: String, uniqueShardIndex: true },
      subdoc: {
        uniqueSubKey: { type: String, uniqueShardIndex: true }
      },
      combinedKey1: String,
      combinedKey2: { type: String }
    });

    // Pass your local mongoose instance to the plugin so that the tracking
    // collection is created in the correct database.
    testSchema.plugin(mongooseUniqueShard, { mongoose: mongoose });

    // Declare a compound unique index spanning several fields.
    testSchema.addUniqueShardIndex(['combinedKey1', 'combinedKey2']);

    var TestModel = mongoose.model('test', testSchema);

    var tm1 = new TestModel();
    tm1.uniqueKey = 'foo';
    tm1.save(function (err) { // succeeds
      var tm2 = new TestModel();
      tm2.uniqueKey = 'foo';
      tm2.save(function (err) {
        // fails: another document already holds this unique value
      });
    });
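
The compound index declared with addUniqueShardIndex works the same way. Assuming it behaves like a regular compound unique index (only the combination of values has to be unique), the following sketch illustrates the idea; the field values are made up for illustration:

    var a = new TestModel({ combinedKey1: 'x', combinedKey2: 'y' });
    a.save(function (err) { // succeeds
      var b = new TestModel({ combinedKey1: 'x', combinedKey2: 'z' });
      b.save(function (err) { // succeeds: only part of the combination matches
        var c = new TestModel({ combinedKey1: 'x', combinedKey2: 'y' });
        c.save(function (err) {
          // fails: the full combination is already taken
        });
      });
    });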

How does this work?

As unique keys are not allowed in a sharded environment, we have to use a separate collection to check for already existing unique key values. If you use this plugin, an additional MongoDB collection called 'mongooseuniqueshards' will be created in your database, holding the information used to perform the uniqueness checks.
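
If you are curious what is stored there, you can peek at that collection through the native driver exposed by mongoose. This is only an illustrative sketch; the layout of the tracking documents is internal to the plugin and may change between versions:

    // Assumes mongoose.connect(...) has already established a connection.
    // The document layout is plugin-internal; this only shows where the
    // uniqueness information lives.
    mongoose.connection.db
      .collection('mongooseuniqueshards')
      .find()
      .limit(5)
      .toArray(function (err, locks) {
        if (err) return console.error(err);
        console.log(locks);
      });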

When validating a document we check that collection for an entry that would violate the uniqueness of the current document. If such an entry is present, validation fails. Make sure your documents are validated before they are saved.
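
A minimal sketch of triggering the check explicitly via validate (save runs validation by default, so this is mainly useful to surface the error early):

    var doc = new TestModel({ uniqueKey: 'foo' });
    doc.validate(function (err) {
      if (err) {
        // a conflicting value was found in the tracking collection
        return console.error('uniqueness violation', err);
      }
      doc.save(function (err) {
        // safe to persist
      });
    });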

What happens if I delete documents?

If you use the document.delete function we will clean up old lock entries for you. Since the original document might also be deleted by other means, we keep a back reference to the _id of the document and its collection with each lock. When a lock for a value pair belonging to a certain document is found, we double-check that this document still exists; if it has been deleted, we also clean up the lock object. Beware: so far we do not re-check the values in the reread document, so make sure you do not modify documents outside of this application, as the lock objects will not be updated.

Bummers

  • Missing transactions could lead to inconsistencies: As you will know, MongoDB has no transaction support. Inserting the document and inserting the blocker are two separate insertions, so two documents might both check in parallel whether the uniqueness entry is present, both see that it does not yet exist, and then try to write it. One of the writes will fail, because the uniqueness entries themselves have to be unique.
  • Old indexes will be removed during the validation process: Updating the blocking entries happens before the final document is saved. If saving the final document then fails for some reason, the new blocking entries will already have been persisted. That by itself is no problem thanks to the double-check nature of the plugin, but any old unique blockers from a previous version of the document will already have been deleted. In this case an inconsistency can arise that allows documents with duplicate keys to be created.

Because of these bummers you might want to run a batch job that double-checks your collection still satisfies uniqueness completely; a sketch follows below.
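
Such a check could be as simple as grouping on each unique path and flagging values that occur more than once. This is only a sketch assuming the uniqueKey path from the example schema; adapt the group key (or run one pipeline per unique path) to your own schema:

    // Find values of 'uniqueKey' that appear on more than one document.
    TestModel.aggregate([
      { $group: { _id: '$uniqueKey', count: { $sum: 1 }, ids: { $push: '$_id' } } },
      { $match: { _id: { $ne: null }, count: { $gt: 1 } } }
    ], function (err, duplicates) {
      if (err) return console.error(err);
      // Each entry lists a duplicated value and the _ids of the offending
      // documents; resolve these conflicts manually.
      console.log(duplicates);
    });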

I need more info/help. I found a bug.

Feel free to contact me or add an issue to the issue backlog on GitHub.

I want testing!

Run npm test to run the test suite, but make sure you have a MongoDB instance running at mongodb://localhost:27017.