SQL DDL to JSON Schema converter

Transforms SQL DDL statements into JSON format (JSON Schema and a compact format).

Overview

Taking the following SQL:

CREATE TABLE users (
  id INT(11) NOT NULL AUTO_INCREMENT,
  nickname VARCHAR(255) NOT NULL,
  deleted_at TIMESTAMP NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP NOT NULL,
  updated_at TIMESTAMP,
  PRIMARY KEY (id)
) ENGINE MyISAM COMMENT 'All system users';

ALTER TABLE users ADD UNIQUE KEY unq_nick (nickname);

It parses and delivers an array of JSON Schema documents (one for each parsed table):

[
  {
    "$schema": "http://json-schema.org/draft-07/schema",
    "$comment": "JSON Schema for users table",
    "$id": "users",
    "title": "users",
    "description": "All system users",
    "type": "object",
    "required": [
      "id",
      "nickname",
      "created_at"
    ],
    "definitions": {
      "id": {
        "$comment": "primary key",
        "type": "integer",
        "minimum": 1,
        "maximum": 1.5474250491067253e+26
      },
      "nickname": {
        "type": "string",
        "maxLength": 255
      },
      "deleted_at": {
        "type": "string"
      },
      "created_at": {
        "type": "string"
      },
      "updated_at": {
        "type": "string"
      }
    },
    "properties": {
      "id": {
        "$ref": "#/definitions/id"
      },
      "nickname": {
        "$ref": "#/definitions/nickname"
      },
      "deleted_at": {
        "$ref": "#/definitions/deleted_at"
      },
      "created_at": {
        "$ref": "#/definitions/created_at"
      },
      "updated_at": {
        "$ref": "#/definitions/updated_at"
      }
    }
  }
]

And an array of tables in a compact JSON format:

[
  {
    "name": "users",
    "columns": [
      {
        "name": "id",
        "type": {
          "datatype": "int",
          "width": 11
        },
        "options": {
          "nullable": false,
          "autoincrement": true
        }
      },
      {
        "name": "nickname",
        "type": {
          "datatype": "varchar",
          "length": 255
        },
        "options": {
          "nullable": false
        }
      },
      {
        "name": "deleted_at",
        "type": {
          "datatype": "timestamp",
          "fractional": 0
        },
        "options": {
          "nullable": true
        }
      },
      {
        "name": "created_at",
        "type": {
          "datatype": "timestamp",
          "fractional": 0
        },
        "options": {
          "nullable": false,
          "default": "CURRENT_TIMESTAMP"
        }
      },
      {
        "name": "updated_at",
        "type": {
          "datatype": "timestamp",
          "fractional": 0
        },
        "options": {
          "nullable": true
        }
      }
    ],
    "primaryKey": {
      "columns": [
        {
          "column": "id"
        }
      ]
    },
    "uniqueKeys": [
      {
        "columns": [
          {
            "column": "nickname"
          }
        ],
        "name": "unq_nick"
      }
    ],
    "options": {
      "comment": "All system users",
      "engine": "MyISAM"
    }
  }
]

Currently, only DDL statements in the MySQL and MariaDB dialects are supported. Check out the roadmap.

Installation

yarn add sql-ddl-to-json-schema
npm i sql-ddl-to-json-schema

Usage

Shorthand

const { Parser } = require('sql-ddl-to-json-schema');
// or:
import { Parser } from 'sql-ddl-to-json-schema';

const parser = new Parser('mysql');

const sql = `
CREATE TABLE users (
  id INT(11) NOT NULL AUTO_INCREMENT,
  nickname VARCHAR(255) NOT NULL,
  deleted_at TIMESTAMP NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP NOT NULL,
  updated_at TIMESTAMP,
  PRIMARY KEY (id)
) ENGINE MyISAM COMMENT 'All system users';

ALTER TABLE users ADD UNIQUE KEY unq_nick (nickname);
`;

/**
 * Read on for available options.
 */
const options = {};

/**
 * Explore the compact JSON format...
 */
const compactJsonTablesArray = parser.feed(sql).toCompactJson(parser.results);

/**
 * Or get the JSON Schema if you need to modify it...
 */
const jsonSchemaDocuments = parser.feed(sql).toJsonSchemaArray(options, compactJsonTablesArray);
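
The result is an ordinary array of draft-07 JSON Schema documents, so it can be handed to any JSON Schema validator. Below is a minimal sketch using Ajv; Ajv is not part of this package and is only one possible choice (assumed to be installed separately):

const Ajv = require('ajv');

const ajv = new Ajv();

// Compile the schema generated for the users table (first element of the array).
const validate = ajv.compile(jsonSchemaDocuments[0]);

// Matches the required columns from the Overview example: id, nickname, created_at.
console.log(validate({ id: 1, nickname: 'ada', created_at: '2024-01-01 00:00:00' })); // true

// Fails validation: "id" and "created_at" are required.
console.log(validate({ nickname: 'ada' })); // false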

Step by step

/**
 * Read on for available options.
 */
const options = { useRef: true };

/**
 * Feed the parser with the SQL DDL statements...
 */
parser.feed(sql);

/**
 * You can get the parsed results in JSON format...
 */
const parsedJsonFormat = parser.results;

/**
 * And pass it to be formatted in a compact JSON format...
 */
const compactJsonTablesArray = parser.toCompactJson(parsedJsonFormat);

/**
 * Finally, format it into an array of JSON Schema documents, one for each table...
 */
const jsonSchemaDocuments = parser.toJsonSchemaArray(options, compactJsonTablesArray);
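
Since each JSON Schema document carries its table name in title (and $id), the array can be persisted however suits your project, for example one file per table. A sketch using Node's fs module; the file-naming convention here is just an illustration, not something the library prescribes:

const fs = require('fs');

jsonSchemaDocuments.forEach((schema) => {
  // e.g. the users table becomes users.schema.json
  fs.writeFileSync(`${schema.title}.schema.json`, JSON.stringify(schema, null, 2));
});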

Options for JSON Schema output

There are a few options when it comes to formatting the JSON Schema output:

useRef

Whether to place each column's schema under definitions and have properties only reference them via $ref (as in the Overview output above).

Default value: true.
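
For illustration, a sketch of how the two settings are expected to differ for the nickname column from the Overview. It assumes the parser and compactJsonTablesArray from the Usage section above; the useRef: false shape is inferred from the description of the option, not taken from the library's documentation:

// With useRef: true (the default), properties reference entries under definitions:
parser.toJsonSchemaArray({ useRef: true }, compactJsonTablesArray);
// -> properties: { nickname: { "$ref": "#/definitions/nickname" }, ... }

// With useRef: false, each column schema is presumably inlined directly in properties:
parser.toJsonSchemaArray({ useRef: false }, compactJsonTablesArray);
// -> properties: { nickname: { "type": "string", "maxLength": 255 }, ... }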

Version compatibility table

| This lib version range | NodeJS version range | Angular support | Other browser-based JS support |
| ---------------------- | -------------------- | --------------- | ------------------------------ |
| <= 3.x                 | >= 6.x               | No              | Yes                            |
| >= 4                   | >= 8.6               | Yes             | Yes                            |

What it is, what it is not

It is a SQL DDL parser for JavaScript, based on nearley. It parses DDL statements only and converts them to JSON; no DML is supported.

It is not a SQL DBMS, a SQL server, or a SQL client.

About

This project is a grammar and a stream-friendly SQL parser based on nearley. No SQL server, client or DBMS is required.

To see which DDL statements / SQL dialects are supported, check out the roadmap.
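
Because the parser is described as stream-friendly, DDL can presumably be fed in pieces as it arrives rather than as one big string. A sketch under that assumption; the chunk boundaries are illustrative, and feed() is assumed to accept repeated calls before the results are read:

const { Parser } = require('sql-ddl-to-json-schema');

const parser = new Parser('mysql');

// Feed the DDL in chunks, e.g. as they arrive from a file or network stream.
parser.feed('CREATE TABLE users (\n');
parser.feed('  id INT(11) NOT NULL AUTO_INCREMENT,\n');
parser.feed('  PRIMARY KEY (id)\n');
parser.feed(');\n');

// Read the accumulated results once everything has been fed.
const tables = parser.toCompactJson(parser.results);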

Contributing

You are welcome to contribute!

Preferably use npm, as all scripts in package.json are run through npm.

  • Clone this repo
  • Install dependencies: npm i

Committing

Prefer using the latest version of NodeJS.

To commit, use commitizen: git cz (you will need the devDependencies installed: npm i).

Understanding the internals

Folder structure:

|- lib/                   Compiled library folder, product of this project.
|
|- src/
|  |- typings/            Types used throughout project.
|  |- shared/             Shared files used by dialects, parsers and formatters.
|  |- mysql/
|     |- formatter/       Formats the parsed JSON (output of the parser) into other formats.
|        |- compact/      Formatter for compact JSON format.
|        |- json-schema/  Formatter for JSON Schema format.
|     |- language/
|        |- dictionary/   TS files with array of keywords and symbols used in lexer.ne.
|        |- rules/        Nearley files with grammar rules.
|        |- lexer.ne      Entrypoint and first lines of the grammar.
|
|- tasks/
|  |- mysql/
|     |- assembly.ts      Script that concatenates all .ne files into grammar.ne in the lib folder.
|     |- formatters.ts    Script that copies the formatters to the lib folder.
|
|- test/                  Tests.

  • There are naming rules for tokens in .ne files, as stated in lexer.ne. They are prefixed with:

    K_ -> Keywords
    P_ -> Phrase (aka statements)
    O_ -> Options (one of several keywords or phrases)
    S_ -> Symbol (not a keyword, but chars and other matches by RegExp's)

  • The dictionary/keywords.ts file contains the keywords, but they are prefixed with K_ when used in .ne files. Take a look to make sure you understand how it is exported.

  • The assembled grammar.ne file is a concatenation of lexer.ne and the files in the language folder, so there is no need to import .ne files from other .ne files. This avoids circular dependencies, and grammar rules defined in lexer.ne are in scope for all files (so they don't have to be repeated in every file).

Scripts at hand

These scripts apply to all SQL dialects:

  • Assemble grammar.ne and compile to grammar.ts: npm run build
  • Same as above, but watch for changes: npm run build:watch
  • Run tests: npm run test
  • Test and watch for changes: npm run test:watch

Visual Studio Code

Debug launch config is versioned in this repository.

Links