@openfinance/dsl-queries

v2.3.3

A small node library for normalizing and validating incoming DSL queries in a specific format.

DSL Queries

NOTE: This library is now hosted on GitHub Packages at https://github.com/openfinanceio/ts-dsl-queries. For the latest updates, please use the package @openfinanceio/ts-dsl-queries and point npm at the GitHub package registry (see GitHub's guide).
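For example, installing the newer package from GitHub Packages typically means pointing the @openfinanceio scope at GitHub's npm registry in your .npmrc. This is only a sketch; see GitHub's guide for the full setup, including any authentication it requires:

# .npmrc
@openfinanceio:registry=https://npm.pkg.github.com

npm install @openfinanceio/ts-dsl-queries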

This library is intended to make general data-filter queries more manageable. It attempts to solve two problems: when we request data from some datasource (say, a REST API or a database), (a) that request should be agnostic about the underlying datasource implementation, but often isn't; and (b) we want to be able to validate the incoming query, both to offer the user useful feedback about how to make it valid and to ensure that the service is not being abused.

To that end, this library codifies a general "parse tree" data structure, expressed in JSON, that makes an incoming query easy to understand. A query takes one of the following forms:

  • A single QueryLeaf object;
  • A single QueryNode object ('and' is assumed) containing one or more QueryLeafs; or
  • A full DslQueryData object (again, 'and' is assumed if no o value is provided)

For reference, the above objects are defined as follows:

// FieldName and ComparisonOperator are shown as plain string aliases here for illustration
type FieldName = string;
type ComparisonOperator = string;

type QueryLeaf = [FieldName, ComparisonOperator, Value | Array<Value>];

type QueryNode = Array<QueryLeaf | DslQueryData>;

interface DslQueryData {
  o: "and" | "or";
  v: QueryNode;
}

// Value is a short-hand alias:
type Value = string | number | boolean | null;

A query can be as simple as ["myField","=","someval"], but it can also be more complex. A query of medium complexity might be [["myField","=","someval"],["otherfield","!=","otherval"]], in which case "and" is the implied logical operator. Finally, you can write very complex queries, such as this one:

{
  o: "and",
  v: [
    ["name", "=", "test"],
    {
      o: "or",
      v: [
        ["age", ">", 30],
        ["status", "in", ["deceased", "disabled"]],
        {
          o: "and",
          v: [["parent", "in", ["bob", "tammy"]], ["status", "=", "youthful"]]
        }
      ]
    }
  ]
}

This translates to the following more "human-readable" form:

name === "test" &&
(
  age > 30 ||
  status in ("deceased", "disabled") ||
  (
    status === "youthful" &&
    parent in ("bob", "tammy")
  )
)
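If you are working in TypeScript, the same query can be written against the type definitions sketched above. This is just an illustration; the actual exported type names may differ:

const query: DslQueryData = {
  o: "and",
  v: [
    ["name", "=", "test"],
    {
      o: "or",
      v: [
        ["age", ">", 30],
        ["status", "in", ["deceased", "disabled"]],
        { o: "and", v: [["parent", "in", ["bob", "tammy"]], ["status", "=", "youthful"]] }
      ]
    }
  ]
};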

On the back-end, the library allows you to specify constraints on the incoming query, including valid field names and the operators that may be used with those fields (falling back on defaults). It also provides functionality for translating the query into a well-formed, implementation-specific query string with parameters, such as a SQL query (the default).

Examples

On the front-end, you might do something like this (pseudo-code):

const response = await request("GET", "https://my.api.com/users", {
  params: [
    [ "filter", JSON.stringify(["email", "like", "%example.com"]) ]
  ]
});

// use response.....
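With the browser's standard fetch API, that pseudo-code might look roughly like this (the endpoint and filter are just the example values from above):

const url = new URL("https://my.api.com/users");
url.searchParams.set("filter", JSON.stringify(["email", "like", "%example.com"]));

const response = await fetch(url).then((r) => r.json());

// response.data contains the filtered results (per the back-end example below)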

On the back-end, when you receive this query, you might do something like this:

import * as Errors from "@openfinance/http-errors";
import { DslQuery } from "@openfinance/dsl-queries";

// ....

// Define what fields are allowed, and which operators are allowed for those fields.
// Note that the library has some knowledge of what values are valid for each
// operator type. For example, "in" can accept an array of values, while "=" cannot and
// will throw an exception.
const filterSpec = {
  fieldSpecs: {
    "name": ["=", "!="],
    "email": ["=", "!=", "in", "like", "not like"]
  }
}

try {
  // The user may not have passed a filter at all, so we need to check for that
  const filter = req.params.filter
    ? new DslQuery(req.params.filter, filterSpec)
    : null;

  if (filter) {
    // Maybe do a little sanity checking?
    if (filter.has("name") && filter.has("email")) {
      throw new Errors.BadQuery("Can't query both name and email (BS example, whatever)", "NameAndEmail");
    }
  }

  // toString() returns a parameterized clause, e.g. `[ "email like ?", ["%example.com"] ]`;
  // fall back to a trivially-true clause when no filter was passed (example only)
  const sql = filter ? filter.toString() : ["1 = 1", []];
  const result = await someDatasource.query(sql[0], sql[1]);
  res.code(200).send({
    data: result
  });
} catch (e) {
  // DslQuery will throw an HttpError (from @openfinance/http-errors:
  // https://www.npmjs.com/package/@openfinance/http-errors)
  // If it's not already an HttpError, just convert it for easy responses
  if (!Errors.isHttpError(e)) {
    e = Errors.InternalServerError.fromError(e);
  }

  // Return an error response
  res.code(e.status).send({
    errors: e.obstructions.length > 0
      ? e.obstructions.map((o) => ({ code: o.code, title: e.name, detail: o.text }))
      : [{ code: e.code, title: e.name, detail: e.message }]
  });
}

The above code handles a considerable amount of validation for you. If the user passes in a random string, it throws a useful error explaining what it's expecting instead. If the user passes in malformed JSON, it lets them know. If the user passes in unacceptable fields or operators that aren't available for a given field, it tells them what the problem is.
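As a rough sketch of what that looks like in practice (reusing the filterSpec from the example above; the exact error classes and messages come from the library, this only shows the shape of the flow):

try {
  // "address" is not declared in filterSpec, so this should be rejected
  new DslQuery(JSON.stringify(["address", "=", "123 Main St"]), filterSpec);
} catch (e) {
  // e is an HttpError describing which fields/operators are acceptable,
  // so it can be returned to the caller as-is (see the catch block above)
  console.log(e.status, e.message);
}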

Currently, however, there is no way to do value validations, nor is it particularly easy to make sense of complex queries. For example, if a user passes in a query like [["name","=","me"],["name","!=","me"]], the library has no general mechanism for flagging that this query is self-contradictory, so it would simply be passed on to the datasource and you would never get any results.

To-Do

  • Add optional value constraints to QuerySpec object