
elk-logging-core (v0.1.4)

Core logging tools

Library

The library is designed to simplify sending logs from front-end applications to Elasticsearch for further visualization, for example with Kibana.

The library is under development!

Motivation

In the process of testing applications, errors often arise that are difficult to reproduce. These include “floating bugs” that occur periodically under strictly defined conditions. To simplify finding and eliminating such errors, it helps to know where they occur, in which parts of the code, and under what conditions.

To track the actions that lead to errors, various logging systems are used; they store user actions and the errors that occur, together with timestamps, in one place. On the back end there are many ways to organize log storage: files, records in a DBMS, and so on. Front-end developers cannot use these approaches directly, so a different mechanism is needed.

To implement such a system on the front end, the logical choice is a logging system that can receive data through some kind of Web API. It is also best to store all the logs in one place, which reduces the number of systems used by different applications and, accordingly, the time spent supporting them. This leads to the main idea: centralized storage of logs from the front-end parts of applications, written through a Web API (for example, REST).
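To make the idea concrete, here is a minimal sketch of writing a single log document straight to Elasticsearch over its REST API. The host, index name, and field names are illustrative assumptions, not part of this library:

    // Illustrative only: one log document written directly to Elasticsearch.
    // The host, the index name ('my-frontend-app') and the field names are assumptions.
    fetch('http://localhost:9200/my-frontend-app/_doc', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        '@timestamp': new Date().toISOString(),
        level: 'error',
        message: 'Example front-end log entry',
      }),
    });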

Approaches to solving the problem

There are many specialized systems that solve this problem: Sentry, Rollbar, Raygun, Airbrake, Bugsnag, Tracejs, and others. All of them have excellent support, and we recommend that you familiarize yourself with them. However, for small projects their cost may be a negative factor.

Alternative approach

As an alternative, you can use the bundle that is often used for monitoring back-end services: the ELK stack. This system consists of three (often four) components:

  1. Elasticsearch. The search system: a NoSQL DBMS with a powerful search engine. Allows writing records through a REST API.
  2. Kibana. The visualization system, used to visualize data from various sources, including Elasticsearch.
  3. Logstash. The system used to collect and parse logs, transform them, and send them to Elasticsearch.
  4. Filebeat (optional). Used to collect logs from various sources and ship them to Logstash.

When developing front-end applications, you can drop the last two components from this bundle and keep only Elasticsearch and Kibana: logs are sent directly to Elasticsearch and rendered with Kibana.

Setting up Elasticsearch and Kibana

Both applications can be deployed from Docker images. The only configuration you need is to enable CORS support in Elasticsearch: because the system is centralized, logs must be received from different origins, so cross-domain requests are involved. To do this, add the following lines to the Elasticsearch config (elasticsearch.yml):

    http.cors.allow-origin: "*"
    http.cors.enabled: true
    http.cors.allow-credentials: true
    http.cors.allow-methods: OPTIONS, GET, HEAD, POST, PUT, DELETE
    http.cors.allow-headers: X-Requested-With, X-Auth-Token, Content-Type, Content-Length, Authorization, Access-Control-Allow-Headers, Accept

The goal of this library

The package provides a logger, an enum with logging levels, and an error type that is thrown exclusively by the logger (to prevent cyclic processing and logging when writing error hooks). The logger object is configured with a config object that contains the following properties:

    • application (required). Used to configure the index of records in Elasticsearch. All entries from the application will be created under this index with the type "frontend-application".
    • url (required, example: https://localhost:9200). Used to configure the Elasticsearch address.
    • timeout (optional, default = 5000). Used to configure the response timeout for requests to the service.
    • deprecated (optional, default = false). Must be set when using Elasticsearch < 6.

After configuration, you can use the methods of this instance inside the application.
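For illustration, a minimal configuration sketch; the export names (Logger, LogLevel) and the log method are assumptions, since the package's exact API is not documented above:

    // Hypothetical usage sketch: the names below are assumptions, not the
    // package's documented API.
    import { Logger, LogLevel } from 'elk-logging-core';

    const logger = new Logger({
      application: 'my-frontend-app', // index for records in Elasticsearch (required)
      url: 'https://localhost:9200',  // Elasticsearch address (required)
      timeout: 5000,                  // response timeout (optional, default = 5000)
      deprecated: false,              // set to true for Elasticsearch < 6 (optional)
    });

    // Write a log entry using a level from the provided enum.
    logger.log(LogLevel.Error, 'Unhandled error in checkout flow');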

Customization of visualization in Kibana

  1. Send a test log to Elasticsearch.
  2. Open Settings → Index Management and create an index pattern for the application.
  3. Go to the visualization section. Logs for the application will be displayed there.

Solutions to possible problems

  • I cannot create an Index Pattern in Kibana because my index is not listed.

Try logging at least one message; the index should be created automatically if the server accepted your request. Errors when sending the log can be tracked in the browser's developer console. A request to http://<host>:<port>/_bulk should return a 200 response code (see the sketch below for what such a request looks like).
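For reference, a minimal sketch of the kind of _bulk request the browser would send; the host, index name, and payload fields are assumptions (the "frontend-application" type comes from the config description above):

    // Illustrative _bulk request; the host, index name and fields are assumptions.
    // The body is NDJSON: an action line, a document line, and a trailing newline.
    const body =
      JSON.stringify({ index: { _index: 'my-frontend-app', _type: 'frontend-application' } }) + '\n' +
      JSON.stringify({ '@timestamp': new Date().toISOString(), level: 'info', message: 'test log' }) + '\n';

    fetch('http://localhost:9200/_bulk', {
      method: 'POST',
      headers: { 'Content-Type': 'application/x-ndjson' },
      body,
    }).then((res) => console.log(res.status)); // 200 means Elasticsearch accepted the write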

  • The Index Pattern in Kibana is created indefinitely (creation never finishes).

Try sending the following queries, which clear the read-only block that Elasticsearch applies when disk usage crosses its flood-stage watermark, and disable the disk threshold check:

    curl -X PUT 'http://<host>:<port>/_all/_settings' -H 'Content-Type: application/json' -d '{"index.blocks.read_only_allow_delete": null}'

    curl -X PUT 'http://<host>:<port>/_cluster/settings' -H 'Content-Type: application/json' -d '{"transient": {"cluster.routing.allocation.disk.threshold_enabled": false}}'