
@shopline/sl-express

v3.3.0

A nodejs framework based on expressjs

Shopline Nodejs Framework

Objective

This framework sets up an organized NodeJS project in an easier and 'classier' manner. Since it is used internally, we added our own way of standardizing the code structure.

This framework wraps around ExpressJS, with the following additions:

  1. App class - a big 'motherboard'
  2. A router
  3. Some default middlewares
  4. A Dockerfile
  5. A project folder structure

App class

This framework has an App Class which acts as a 'motherboard' of the whole application. It controls the phases of the app in the following sequence:

loading phase
  1. load configurations from the /config folder
  2. load framework-related models and export them to a context (defaults to global)
  3. import the services folder and export to the context
  4. import the models folder and export to the context
  5. import the viewModels folder and export as context.ViewModels
  6. import the controllers folder and export to the context

starting phase
  1. connect to dependent services (like mongo, redis)
  2. start the service (defaults to starting an express server)

stopping phase
  1. stop the service (defaults to stopping an express server)
  2. disconnect from dependent services
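The three phases above can be sketched as a minimal class. This is only an illustration of the ordering the App enforces, not the framework's actual implementation:

```javascript
// A minimal sketch of the phase sequence described above — hypothetical names,
// just to show the order in which the 'motherboard' runs its steps.
class MiniApp {
  constructor() { this.log = [] }

  prepare() { this.log.push('loading') }                         // loading phase
  async connectDependencies() { this.log.push('connect') }       // starting phase, step 1
  async startService() { this.log.push('start') }                // starting phase, step 2
  async stopService() { this.log.push('stop') }                  // stopping phase, step 1
  async disconnectDependencies() { this.log.push('disconnect') } // stopping phase, step 2

  async start() {
    this.prepare()
    await this.connectDependencies()
    await this.startService()
  }

  async stop() {
    await this.stopService()
    await this.disconnectDependencies()
  }
}

const app = new MiniApp()
app.start().then(() => app.stop()).then(() => {
  console.log(app.log.join(' -> '))
  // loading -> connect -> start -> stop -> disconnect
})
```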
Router

Router is a simple component of the framework. It intends to make routing easier. It takes in a big object, parses it, and calls the express router to do the routing.

In most cases, the big routing object will be config/routes.js

middlewares

Middlewares can be called before and after a request. The name of a middleware is the same as its filename.
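Since the framework wraps express, a middleware is presumably just an express-style (req, res, next) function exported from a file whose name is what routes refer to. A hypothetical example:

```javascript
// api/middlewares/authenticate.js — hypothetical file; routes would refer to it
// by its filename ("authenticate"). Assumes the standard express-style
// (req, res, next) signature, since the framework wraps express.
function authenticate(req, res, next) {
  if (!req.headers || !req.headers.authorization) {
    // stop the chain and respond early
    return res.status(401).send('unauthorized')
  }
  // hand over to the next middleware / controller action
  next()
}

module.exports = authenticate
```

With this file in place, a route string like `GET /private authenticate PrivateController.show` would run it before the action.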

Dockerfile

The framework provides a Dockerfile for building a Node.js application image.

View models

TBC

Getting Started with an example

installing sl-express framework

  1. Install NodeJS on your machine
  2. Install @shopline/sl-express globally: npm i -g @shopline/sl-express
  3. Clone this repo to your machine
  4. Go into the example/basic directory in your terminal and run npm install

starting a sample express server

  1. Run sl-express start on your terminal
  2. Open your browser and go to http://localhost:3000
  3. You should also see a log on terminal that the route has been requested

starting a sample express console

  1. Run sl-express console on your terminal and this will bring you into the express console
  2. Type app.id on your express console to see the time when you start your console

starting a sample express console with async mode

  1. Run sl-express asyncConsole on your terminal and this will bring you into the express console with async mode
  2. Type app.id on your express console to see the time when you start your console

starting the express server with docker run

example/basic/Dockerfile has already been configured

  1. Install Homebrew on your machine
  2. Install docker via Homebrew brew cask install docker
  3. Go into the example/basic directory on your terminal and build a docker image by docker build --tag=test-app .
  4. Create a docker container by docker run -p 3000:3000 test-app
  5. Open your browser and go to http://localhost:3000
  6. Press ctrl+c to stop the container. You can check all the containers you have with docker ps -a
  7. Run docker rm <CONTAINER_ID> to remove the container

starting the express server with docker-compose

example/basic/docker-compose has already been configured

  1. Go into the example/basic directory on your terminal and build the images by docker-compose build
  2. Create docker containers by docker-compose up
  3. Open your browser and go to http://localhost:3000
  4. Press ctrl+c to stop the containers, or run docker-compose stop

Use cases

customizing the app phases

There are a few phases you can customize by overriding methods. Always remember to call the super method appropriately.

app.prepare()

This phase performs the loading phase of the app. In most cases, you would do:

  prepare() {
    super.prepare()
    /* your extra loading here */
  }
app.connectDependencies()

This phase handles the connections to other services like mongo, redis and rabbitmq. By default, it connects to no services. My suggestion would be:

  async connectAwesomeService() {
    /* connect... connect... connect... */
  }

  async disconnectAwesomeService() {
    /* disconnect... disconnect... disconnect... */
  }

  async connectDependencies() {
    await super.connectDependencies()
    /* your other connections here */
    await this.connectAwesomeService()
  }
app.disconnectDependencies()

This phase handles the disconnection from other services like mongo, redis and rabbitmq. By default, it disconnects from no services. My suggestion would be:

  async connectAwesomeService() {
    /* connect... connect... connect... */
  }

  async disconnectAwesomeService() {
    /* disconnect... disconnect... disconnect... */
  }

  async disconnectDependencies() {
    /* the best practice is to disconnect in the reverse order of the connections */
    await this.disconnectAwesomeService()
    /* your other disconnections here */
    await super.disconnectDependencies()
  }
app.startService()

The phase that actually starts the service. By default, it starts the express server. You can customize it conditionally:

async startService() {

  /* this.role is an example property you would define yourself */
  if (this.role === 'SERVER') {
    await this.startExpress()
    return
  }

  if (this.role === 'WORKER') {
    /* start consuming the queue */
    return
  }

}
app.stopService()

The phase that stops the service. By default, it stops nothing. You can also customize it conditionally:

async stopService() {

  /* this.role is an example property you would define yourself */
  if (this.role === 'SERVER') {
    await this.stopExpress()
    return
  }

  if (this.role === 'WORKER') {
    /* stop consuming the queue */
    return
  }

}

serving an api

routing and middlewares

To add a route, you can simply add a string to routes: []. The framework will split it by spaces.

The pattern should be:

HTTP_METHOD URI middleware middleware Controller.action
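The "split by spaces" behaviour for a route string matching this pattern could be sketched like so (an illustration only, not the framework's actual parser):

```javascript
// A sketch of splitting a route string into its parts by spaces.
function parseRoute(routeString) {
  const parts = routeString.split(' ')
  return {
    method: parts[0],                 // e.g. 'GET'
    uri: parts[1],                    // e.g. '/index'
    middlewares: parts.slice(2, -1),  // zero or more middleware names
    action: parts[parts.length - 1],  // e.g. 'PublicController.index'
  }
}

console.log(parseRoute('GET /index authenticate PublicController.index'))
```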

Sometimes you may not want to insert middlewares one by one. In that case, you can use preMiddlewares. Please check: https://expressjs.com/en/guide/using-middleware.html

The pattern should be:

REGEX middleware middleware

Please also reference: https://expressjs.com/en/guide/routing.html

module.exports = {

  preMiddlewares: [
    '* middleware middleware'
  ],

  routes: [
    'GET /index PublicController.index'
  ],

  postMiddlewares: [],
}
add a controller

This is how a controller should be added to the api/controllers directory.

In this example, the routes config can reference the index action as PublicController.index.

class PublicController {

  async index(req, res) {

    return res.send('hello world')

  }

}

module.exports = PublicController

connecting to new services

creating new Service

First, you will need to create a class under the api/services directory.

api/services/AwesomeService.js

let _theLibYouUse = null;
let _sharedAwesomeService = null;

class AwesomeService {
    /* A lazy-loading singleton. It ensures the lib will not be required if the
     * service is not used. It may seem a bit dirty to require the lib inside
     * functions, but it makes this service able to move into the core framework
     * some day.
     */

    static get theLibYouUse() {
        if (!_theLibYouUse) {
            _theLibYouUse = require('theLibYouUse');
        }

        return _theLibYouUse;
    }

    /* A singleton. In most cases you will only need to init one Service instance. It is still better to use the singleton pattern so that you can stub it easily when unit testing Model methods that make use of this service */

    static get sharedAwesomeService() {
        if (!_sharedAwesomeService) {
            _sharedAwesomeService = new AwesomeService();
        }

        return _sharedAwesomeService;
    }

    /* As a singleton is used, it is hard to pass the config when initializing the service. That's why we use init instead of the constructor. Besides, we may not want to set the config or directly read the global config inside this class, because it's better to keep it with fewer dependencies. The config should be passed to the singleton in the motherboard */

    init(config) {
        this.endpoint = config.endpoint;
        this.abc = config.abc;
    }
}

module.exports = AwesomeService;

config/awesomeService.js

module.exports = {
    endpoint: process.env.AWESOME_SERVICE_ENPOINT
};

.env

# all env-dependent variables should be put in the .env file
AWESOME_SERVICE_ENPOINT=http://awesomeservice.com/api

app.js

  async connectAwesomeService() {

    AwesomeService.sharedAwesomeService.init(this.config.awesomeService)
    await AwesomeService.sharedAwesomeService.connect()

  }

  async disconnectAwesomeService() {

    await AwesomeService.sharedAwesomeService.disconnect()

  }

  async connectDependencies() {

    await super.connectDependencies()
    await this.connectAwesomeService()

  }

  async disconnectDependencies() {

    await this.disconnectAwesomeService()
    await super.disconnectDependencies()

  }

config

Most frameworks like to use a structure like

- config/
  - env/
    - development.js
    - production.js
  - config1
  - config2

These frameworks first gather config1 and config2, then override them with the specified environment config. Yet this framework WON'T do this.

All environment-related config should be controlled by the .env file.
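Under this rule, a config file simply reads from the environment. A hypothetical example (the file and variable names are made up for illustration):

```javascript
// config/someFeature.js — hypothetical config file illustrating the rule above:
// each config module reads its values straight from process.env, so the
// environment (populated from .env) is the single thing that varies per deployment.
module.exports = {
  timeoutMs: parseInt(process.env.SOME_FEATURE_TIMEOUT_MS || '5000', 10),
  apiHost: process.env.SOME_FEATURE_API_HOST,
}
```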

add logging

This framework uses log4js wrapped in a logger plugin. Things can be configured in config/logger.js.

Please config your config/app.js

{
    plugins: ['logger'];
}

There is no magic for configuring the Logger. Please visit: https://www.npmjs.com/package/log4js

In most cases, you just need to add categories like 'broadcast' or 'queueHandling', based on which feature you want to log.

Besides, as we are using CloudWatch, we just append our logs to stdout at the moment.

more about logging practice

Log level

  1. debug
  2. trace: In most cases we add trace logs everywhere, since we should be able to investigate problems in a black-box system in production
  3. warn: errors that are not exactly exceptional, but weird behaviour you want to keep track of
  4. error: every exception should be logged with an error log, no matter whether it breaks the process or not
  5. info: system-wise logs, like 'connected mongo'

Log structure

must-have:

  1. logCategory: a category to group logs. In most cases, it is designed by feature, like 'broadcast' or 'notificationMessage'
  2. logLevel: as described in the section above
  3. obj: an object to be passed through JSON.stringify. WE HIGHLY RECOMMEND YOU ADD THE FOLLOWING: 1. action (a string describing the process), 2. traceId
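A hypothetical helper showing that shape (the framework's logger plugin wraps log4js; only the structure of the entry is what this sketch illustrates):

```javascript
// Produce one log line with the recommended structure:
// logCategory, logLevel, and a stringified obj carrying action and traceId.
function formatLog(logCategory, logLevel, obj) {
  return `[${logCategory}] [${logLevel}] ${JSON.stringify(obj)}`
}

console.log(formatLog('broadcast', 'trace', {
  action: 'sendBroadcast', // which process produced this log
  traceId: 'abc-123',      // lets you follow one request across log lines
}))
```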

Use cases with built-in Model / Service

using mongoose for models

There is a built-in model called MongooseModel. This model aims to:

  1. allow class declaration using class instead of prototype
  2. handle the mixing of the mongoose schema and the class by using mongoose.model
  3. provide a more user-friendly way to use the mongoose pre and post hooks.
class AwesomeModel extends MongooseModel {
    static schema() {
        /* you can always access the mongoose library with this.mongoose */
        return {
            ownerId: { type: String, required: true }
        };
    }

    static beforeSave(obj, next) {
        //do something
        return next();
    }
}

module.exports = AwesomeModel;
connecting to mongo

Mongo needs to be connected when MongooseModel is used. There is a static getter in both MongooseModel and App: the one in MongooseModel returns the mongoose lib, and the one in App returns the mongoose in MongooseModel. Most of the time, they are the same.

There are built-in functions for connecting to mongo. All you need to do is add ENVs to your .env file. The framework has a config/mongoose.js that maps the mongo endpoint to ENVs, so you just need to set them:

MONGODB_USER
MONGODB_PASS
MONGODB_HOST
MONGODB_PORT
MONGODB_DATABASE

Please config your config/app.js

{
    plugins: ['mongoose'];
}

Add the following to your docker-compose.yml

version: '3'

services:
    # your app build
    # ...

    mongo:
        image: 'mongo'
        ports:
            - '27017:27017' # configure your port
        volumes:
            - 'mongodb:/data/db'
    # ...
    # your other services (rabbitmq, redis)

volumes:
    mongodb:
        driver: local

using redis

By default, the framework has a config file mapping ENVs to the redis config:

REDIS_USER
REDIS_PASS
REDIS_HOST
REDIS_PORT
REDIS_DATABASE

Please config your config/app.js

{
    plugins: ['redis'];
}

Add the following to your docker-compose.yml

version: '3'

services:
    # your app build
    # ...

    redis:
        image: 'redis'
        ports:
            - '6379:6379' # configure your port

    # ...
    # your other services (rabbitmq, mongo)

using messageQueue

By default, the framework has a config file mapping ENVs to the rabbitmq config:

RABBITMQ_USER
RABBITMQ_PASS
RABBITMQ_HOST
RABBITMQ_PORT
RABBITMQ_PREFETCH_COUNT
RABBITMQ_QUEUE_PREFIX

Please config your config/app.js

{
    plugins: ['messageQueue'];
}

Add the following to your docker-compose.yml

version: '3'

services:
    # your app build
    # ...

    rabbitmq:
        image: 'rabbitmq:3-management'
        ports:
            - '5672:5672' # configure your port
            - '15672:15672'
    # ...
    # your other services (redis, mongo)

using QueueTask

You need to connect to both Redis and RabbitMQ for this feature. By default, we store the payload of a task in Redis and only send the task id to RabbitMQ. This design avoids sending overly large payloads to RabbitMQ. QueueTask, as a model, handles all of this for you.

Please refer to the using redis and using messageQueue sections above.

To get it set up, you need to add the following code:

Add queueTask plugin to config/app.js

{
    plugins: ['queueTask', '...your other plugins'];
}

Add a config file config/queueTask.js

module.exports = [
    {
        type: 'TEST', // an identifier for your task
        queue: 'test_queue', // the CONSUMER_QUEUE_ID or consumerQueueId to handle the queue
        messageExpireTimeSec: 3600 * 24, // set a timeout on the message, defaults to (3600 * 24)s; after the timeout expires, the message is automatically discarded
        handler: 'Test.dequeue', // the handler of the tasks of this type
        description: 'any remarks you want to add'
    }
];

Add a Test.js file at api/models/Test.js to handle queueing and dequeueing:

class Test {
    static async enqueue(test) {
        //do something to make a payload
        let payload = {
            firstName: test.firstName,
            sex: test.sex
        };

        await QueueTask.queue({
            taskType: 'TEST',
            payload: payload
        });
    }

    static async dequeue(queueTask) {
        let payload = queueTask.payload;

        // handle the payload
        console.log(payload.firstName);
    }
}

module.exports = Test;

To test the queueTask feature, you will have to start your app with a consumer and publisher role. To do so, you can use docker-compose to instantiate two containers of the app, one with a consumer role and the other with a publisher role.

Modify your docker-compose.yml

services:
    test-app-publisher:
        build: .
        ports:
            - '3000:3000'
        volumes:
            - .:/app
            - /app/node_modules
        environment:
            - APP_ROLE=PUBLISHER # to start the container using a publisher role
        command: bash -c "chmod +x ./wait-for-it.sh && ./wait-for-it.sh rabbitmq:5672 -- nodemon server.js"

    test-app-consumer:
        build: .
        volumes:
            - .:/app
            - /app/node_modules
        environment:
            - APP_ROLE=CONSUMER # to start the container using a consumer role
            - CONSUMER_QUEUE_ID=test_queue # set the id the same as the queue attr in config/queueTask.js
        command: bash -c "chmod +x ./wait-for-it.sh && ./wait-for-it.sh rabbitmq:5672 -- nodemon server.js"
    #
    #
    # your other services (redis, mongo, rabbitmq)

using Plugin

You can build any plugin you like using the Plugin feature. SL-express will:

  1. read app.config.plugins
  2. read the /plugins folder of YOUR application and import a plugin ONLY if its key exists in the config
  3. for any keys in the config that could not be imported from there, import them from sl-express (so your local plugins override the defaults)
// config/app.js

module.exports = {
  plugins: [
    'helloWorld',
    'drinkTea',
  ]
}

The plugin must follow this directory structure:

// plugins
- helloWorld
  - index.js
- drinkTea
  - index.js

The export of index.js must provide the following interfaces:

  1. prepare(app) { }
  2. async connectDependencies(app) { }
  3. async disconnectDependencies(app) { }
  4. async willStartService(app) { }
  5. async didStartService(app) { }

These interfaces are tied to specific App phases. Check the App class for details.

app is the App instance. You can get properties through it; in most cases, you will need app.config.
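A sketch of a hook reading config off the app instance it receives; `greet` is a hypothetical config key, and the object passed to prepare below is a stand-in for the real app:

```javascript
class GreetPlugin {
  prepare(app) {
    // pull this plugin's own section out of the app-wide config
    const config = app.config.greet || {}
    this.greeting = config.greeting || 'hello'
  }
}

const plugin = new GreetPlugin()
plugin.prepare({ config: { greet: { greeting: 'hi' } } }) // stand-in app object
console.log(plugin.greeting) // hi
```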

in more advanced usages

// plugins/
- helloWorld
  - lib/
    - ModelA.js
    - ModelB.js
    - HelloWorldService.js
    - HelloWorldPlugin.js
  - index.js
  - README.md
// index.js

const HelloWorldPlugin = require('./HelloWorldPlugin')
module.exports = new HelloWorldPlugin()
// HelloWorldPlugin.js

class HelloWorldPlugin {
  prepare(app) {
    const service = new HelloWorldService()
  }

  async connectDependencies(app) { }
  async disconnectDependencies(app) { }
  async willStartService(app) { }
  async didStartService(app) { }
}

module.exports = HelloWorldPlugin

Layers of SL-express application

We can organise our code structure with five kinds of components:

  1. Controller: its actions receive API calls
  2. AppService: services provided by our sl-express application to complete use cases by making use of models
  3. Model: controls its own queries, data structure and data changes
  4. Service: services that are taken into the application from outside
  5. Plugin: a structure composed of 2, 3 and 4. It can abstract a whole service you want to provide, and it conforms to the interface for integrating with the sl-express app

More about plugins

In a simple way, a plugin is a connector between the app and the libraries / modules you write.

Sometimes, in your application, you may need to solve use cases that involve a complicated logic flow. It may involve many models and external services which only concern this part of the logic, not the whole app. In this case, you would like to "hide" these models and services by wrapping them into a module. You may also need some setup before plugging it into the application. Using the integration feature of the plugin layer, your modules can simply focus on the logic.