
etp-server v0.5.8

JavaScript implementation of an ETP server


ETP Node Server

This is an experimental implementation of a Node.js server and HTML5 client for the Energistics Transfer Protocol (ETP). ETP is a proposed specification for streaming real-time data from oil field drilling and production facilities. It uses WebSockets for transport and Apache Avro for serialization.

This implementation also uses MongoDB for storage, although that is not part of the spec.
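The default MongoDB connection string is mongodb://localhost:27017/witsml (see the options table below), so a local mongod with default settings is enough to get started. You can inspect what the server has stored with the mongo shell, for example:

c:\>mongo witsml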

Prerequisites

  • Install Node from nodejs.org - v0.10 minimum required.
  • Install MongoDB - v3.0 minimum required.
  • Running from source requires Linux or a Linux-like Windows environment such as Cygwin.
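To confirm both prerequisites are installed and on your PATH:

c:\>node --version
c:\>mongod --version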

Installation

First create a working directory for the demo:

c:\>mkdir etpdemo
c:\>cd etpdemo

To install from NPM

npm install etp-server

To install from source

Clone the node folder from Bitbucket, say into c:\etpdemo, then build:

$ make init
$ make -B
$ make test

Running

To run from NPM installation

c:\etpdemo>node node_modules/etp-server/bin/server

To run from Source installation

c:\etpdemo>node dist/bin/server

In either case you should see output similar to:

simple-http-server Now Serving: ./ at http://localhost:8080/

Wed May 08 2013 08:05:21 GMT-0500 (Central Daylight Time) RaLF Server is listening on port 8081

Now point your modern, HTML5-compliant browser at http://localhost:8080
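The HTML5 client served on port 8080 talks to the ETP WebSocket endpoint on port 8081. As a minimal sketch, you can also connect to that endpoint from Node; the ws module and the 'energistics-tp' subprotocol name below are assumptions, not something this package installs or documents.

    // minimal client sketch - assumes `npm install ws` has been run separately
    var WebSocket = require('ws');

    // 8081 is the default --wsPort; 'energistics-tp' is assumed to be the ETP subprotocol name
    var socket = new WebSocket('ws://localhost:8081', ['energistics-tp']);

    socket.on('open', function () {
      console.log('connected to the ETP server');
    });

    socket.on('message', function (data) {
      // ETP frames are Avro-encoded binary; decoding them requires the server's .avpr schema
      console.log('received ' + data.length + ' bytes');
    });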

Options

The following can be passed as command line options when starting the server.

| Option | Default | Description |
| ------ | ------- | ----------- |
| --httpServer | true | Run the web server; set false if you just want the ETP WebSocket server |
| --httpPort port | 8080 | Web server port |
| --wsPort port | 8081 | WebSocket port |
| --schemas | latest | Name of the RaLF schema file to use. Look in the schema folder for .avpr files; any can be used |
| --autoSubscribe | false | Start pushing data without a subscription |
| --defaultSubscription | | Name of a URI to use when auto-publishing |
| --databaseConnectString | mongodb://localhost:27017/witsml | MongoDB connection string |
| --traceMessages | false | Creates a disk log of each message sent and received by the server |
| --traceDirectory | trace | Name of the folder to hold the trace files |
| --help | n/a | Print this information |
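For example, a server that skips the web server, uses a custom WebSocket port and traces every message might be started like this (a sketch only; run with --help to confirm the exact flag syntax your version expects):

node node_modules/etp-server/bin/server --httpServer false --wsPort 8082 --traceMessages true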

Recording Clients

The server now has the ability to record streaming data from other servers, store it in the database and relay the points to any subscribed clients, essentially acting as an aggregator. To enable this feature:

  1. Create a config directory under the main etp-server folder.

  2. Create a file in it called 'recorders.json'.

  3. The file should contain a single JSON array describing the servers you would like to connect to, for example:

     [
     	{
     		"url": "ws://localhost:8082",
     		"encoding": "binary",
     		"retryInterval": "20000",
     		"active": true,
     		"contextUri": "eml:///witsml1411/log(LOUIS-1)"
     	},
     	{
     		"url": "ws://simplestreamer.cloudapp.net",
     		"encoding": "binary",
     		"retryInterval": "20000",
     		"active": true,
     		"contextUri": "eml:///witsml1411/log(SimpleStreamer-1)"
     	},
     	{
     		"url": "ws://192.168.1.51:82",
     		"encoding": "binary",
     		"retryInterval": "20000",
     		"active": true,
     		"contextUri": "eml:///witsml1411/log(BOROMIR-1)"
     	}
     ]

Fields in recorders.json:

url

The address and port of the server to connect to.

encoding

Currently only supports binary.

retryInterval

If the server is not available, or goes down during the connection, the recording client will attempt to retry at this interval (in milliseconds). Set this value to 0 if you don't want to retry at all.

active

If this is set to false, the aggregating server will not even try to connect.

Utilities

In addition to the main server application, there are a number of stand-alone utilities to help load the database and provide simulated data for various ETP configuration scenarios.

--

perfServer

perfServer.js (located in the bin directory) is a convenient way of creating a simple streaming server. It uses the Windows perf counters as a data source (and thus works only on Windows machines) to generate channels at one-second intervals. Use --help to see options. One of the options for perfServer is --skipDuplicates, which causes it to send data points only when the value of a perf counter changes; you will still get a data set per second, but only with the values that have changed.
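For example, to stream only the perf counter values that change between samples (a sketch; other flags are listed by --help):

c:\etpdemo>node node_modules/etp-server/bin/perfServer --skipDuplicates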

--

loadAll

loadAll.js (located in the bin directory) can be used to populate your server database with existing WITSML data. loadAll uses a pool of up to 10 or so processes to load the data set in parallel. There is a companion file, loadOne.js, which is the file forked by loadAll; it can also be run stand-alone to load a single document. Loading the full data set can take 10-30 minutes depending on the memory, cores, SSD, etc. of your machine.
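A sketch of invoking the loaders (any required arguments, such as where the WITSML data set lives, are not documented in this README, so check the scripts themselves before running against a real database):

c:\etpdemo>node node_modules/etp-server/bin/loadAll
c:\etpdemo>node node_modules/etp-server/bin/loadOne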

--

logPlayer

logPlayer.js (located in the bin directory) can be used to simulate a real-time feed, using a witsml1411 time log as input. The algorithm is to read the time difference between successive rows in the data section and then use setTimeout to send the next row at the appropriate time. There is a 'speed' parameter, which is simply a divisor applied to the number of milliseconds between rows and lets you speed up the simulator; going above 1000 doesn't produce meaningful results. If you want to send the entire log as fast as possible, specify a speed of 0.
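In JavaScript terms, the replay loop described above amounts to something like this (illustrative only; the row shape and the send callback are hypothetical placeholders, not this package's actual API):

    // rows:  array of { time: <ms timestamp>, values: [...] } parsed from a witsml1411 time log (hypothetical shape)
    // speed: divisor applied to the real gap between rows; 0 means send everything immediately
    // send:  callback that delivers one row to the connected clients
    function replay(rows, speed, send) {
      var i = 0;
      function next() {
        send(rows[i]);
        i += 1;
        if (i >= rows.length) return;
        var gapMs = rows[i].time - rows[i - 1].time;
        setTimeout(next, speed > 0 ? gapMs / speed : 0);
      }
      if (rows.length > 0) next();
    }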

Known Issues

- Does not currently support re-connecting sessions.
- Only supports describing individual channels.