
datamigrationscript

v1.0.3

Convert Excel data models to SQL Scripts

NodeJS SQL Data Migration Tool

Easily convert spreadsheet data models into SQL or PL/SQL scripts

Features

  • Easy to install and import into your project, either as pure functions and code or as a Node.js module
  • Easily add extra helper methods to cater to your own needs
  • With some tweaks you can build models or SQL scripts from other data types, e.g. JSON or JS objects, so you can do JSON-to-SQL or Object-to-SQL conversion
  • Very modular and flexible
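The Object-to-SQL idea from the list above can be sketched as a small pure function. Note that buildCreateTable and the sample model are illustrative only, not part of the package's API:

```javascript
// Sketch of the "Object to SQL" idea: turn an array of column
// descriptors (shaped like what xlsx's sheet_to_json produces) into a
// CREATE TABLE statement. buildCreateTable is a hypothetical helper,
// not an export of datamigrationscript.
function buildCreateTable(tblname, rows) {
  let script = 'CREATE TABLE ' + tblname + ' (\n';
  // one line per column: name followed by its data type
  script += rows
    .map((row) => '  ' + row['Col Name'] + ' ' + row['Data Type'])
    .join(',\n');
  return script + '\n);';
}

// sample in-memory model, equivalent to two rows of a workbook
const model = [
  { 'Col Name': 'customer_id', 'Data Type': 'NUMBER(10)' },
  { 'Col Name': 'first_name', 'Data Type': 'VARCHAR2(50)' },
];

console.log(buildCreateTable('customers', model));
```

The same function would accept parsed JSON, so JSON-to-SQL is the same call with `JSON.parse(...)` as input.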

Dependencies

NodeJS Data Migration Script uses a number of open source projects to work properly:

  • [XLSX] - Npm module for working with Spreadsheets
  • [NodeJS] - Node.js core modules such as fs (writeFile() and readFile())
  • [JavaScript] - Vanilla ES6

And of course Data Migration Script itself is open source with a public repository on GitHub.

Installation

datamigrationscript requires Node.js v10+ to run.

In your project directory, run npm install and then import the package into your main module or component:

npm i datamigrationscript
var dmscript = require('datamigrationscript');

The only difference from installing most other modules is that this script simply runs and outputs the SQL script as defined in the module's core file, index.js, which is located at node_modules/datamigrationscript/index.js. So if you want the script's behaviour to change, you must edit it inside the module; the module doesn't offer that flexibility from the outside yet.

If you want to simply bring the script/tool into your own code, copy and paste the code from the main file, save it as index.js, install the dev dependencies, and run npm install:

  1. Create an index.js file and paste this code in, or embed it in your main class/file, e.g. main.js
  //+++++++++ NOTES ++++++++++++\\
  // Title: SQL Script Generator\\
  // Ver:  0.0.1                \\
  // Date: 20022021             \\
  //_________By S A Masoud______\\
  
// Requiring the module 
const reader = require('xlsx');
//fs module
const fs = require('fs');  
// Reading our test file 
const file = reader.readFile('./sheetsToRead/customers.xlsx');
//store only rows
let rowNames = [];
//create array obj from found sheets
const sheets = file.SheetNames
//main engine
const mainEngine = () => {
    for(let i = 0; i < sheets.length; i++) 
    { 
        //specify sheet name/workbook in string
        //comment this if condition to read all workbooks instead
        //in this instance our workbook is called Customers
        if(file.SheetNames[i] === 'Sheet1'){ 
            //use the util to convert sheets to json
            const sheetObjects = reader.utils.sheet_to_json(file.Sheets[file.SheetNames[i]])
            //for each row of the sheet, get the 'Col Name' value and push it to an array
            //in our workbook the first column is called 'Col Name' which contains the columns for the table
            sheetObjects.forEach((res) => { rowNames.push(res['Col Name']) });
    
            //specify table name for dest db and a comments variable
            let tblname = 'customers';
            let comments = '';

                            //++++ BUILD OF SQL SCRIPT BODY ++++ \\\
            //build a dynamic object using pure strings that contains our SQL structure
            //so for a table creation we start with CREATE TABLE obviously
            let script = 'CREATE TABLE '+ tblname + '( \n';
            //from the sheet selected loop through to get all columns and filter through and build accordingly // we can instead use the rowNames to get columns
            //for some datatypes because they were incorrect in the Workbook we change them via the if statements
            for(let res in sheetObjects){
                //ADD ALL Col NameS WITH A 'CompanyName_' PREFIX AND ADD TO SCRIPT
                if( sheetObjects[res]['Col Name'] != undefined){ 
                    script += '   ' + 'CompanyName_' + sheetObjects[res]['Col Name'] + ' ' + sheetObjects[res]['Data Type']  + ', \n';
                }
                //ADD A DEFAULT COMMENT TO SCRIPT WHEN NO 'Notes' VALUE IS PROVIDED
                if( sheetObjects[res]['Notes'] == null){
                    comments += '\n COMMENT ON COLUMN ' + tblname + '.' + sheetObjects[res]['Col Name'] + " IS 'No Comments available';";
                } 
            }

            //ADD COMPOSITE KEYS
            script += ' \n -- composite pk \n CONSTRAINT pk_companyname_' + rowNames[0] + ' PRIMARY KEY (CompanyName_'+ rowNames[0] + ',valid_from_date) USING INDEX, \n';
            
            //CONSTRAINT CHECKS
            if(rowNames.length > 0){
                //filter does not work on undefined values so remove them first
                let filteredRowNames = rowNames.filter(function (el) { return el != null; });
                //filter down again with matching element
                let filteredMatchingElement = filteredRowNames.filter(function (el) { return el.includes('_CONSTRAINT'); });
                if(filteredMatchingElement.length < 1){
                    script += ' \n';
                }
                for(let y in filteredMatchingElement){ 
                    script += ' CONSTRAINT chk_'+tblname+'_ind CHECK (' ;
                    const yesNo = "'Y','N'";
                    script += ' \n ' +  filteredMatchingElement[y] + ' IN ('+yesNo+',NULL)),'
                }
            }
            script += ' TABLESPACE CUSTOMER_DATA ); \n\n -- comments on individual columns. Taken from Columns and Attributes spreadsheet in Teams \n ';

            //ADD THE COMMENTS TO THE MAIN SCRIPT
            script += comments;

            //CHECK WHOLE SCRIPT BEFORE CREATING THE BUILD FILE
            console.log(script);

            //SAVE WHOLE SCRIPT TO A SQL FILE BY USING NODEJS WRITEFILE UTILITY
            fs.writeFileSync('sqlOutput/'+tblname+'.sql',script);
        }
    } 
}

//execute order
//1. mainEngine
mainEngine();
  2. Install the dependencies
npm install xlsx --save-dev
npm install
  3. Then run it as one file/component, or run your main.js file if you embedded the code instead:
node index.js
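To get a feel for the file the script writes, here is the COMMENT ON COLUMN step from the code above, run against a tiny in-memory model instead of an .xlsx file (sample data only): columns without a 'Notes' value receive a default comment.

```javascript
// Sketch: the comment-building step from index.js, applied to a tiny
// in-memory model (sample data, not a real workbook). Rows without a
// 'Notes' value get a default comment, matching the script's behaviour.
const tblname = 'customers';
const sheetObjects = [
  { 'Col Name': 'customer_id', 'Data Type': 'NUMBER(10)' },
  { 'Col Name': 'first_name', 'Data Type': 'VARCHAR2(50)', Notes: 'given name' },
];

let comments = '';
for (const row of sheetObjects) {
  // same null check as the main script: only uncommented columns get the default
  if (row['Notes'] == null) {
    comments += '\n COMMENT ON COLUMN ' + tblname + '.' + row['Col Name'] +
      " IS 'No Comments available';";
  }
}

console.log(comments.trim());
// → COMMENT ON COLUMN customers.customer_id IS 'No Comments available';
```

In the real script this string is appended after the CREATE TABLE body and written out with fs.writeFileSync to sqlOutput/customers.sql.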