# @esdl/service
The `@esdl/service` is a micro-service that can serve ESDL (Energy System Description Language) Energy Systems from a database (using the `@esdl/orm` module, where ORM stands for Object Relational Mapping) or from a CSV file (using the `@esdl/csv` module). Using configuration over convention, a savvy non-developer should also be able to prepare their data so it can be served in a Docker container. The micro-service allows you to query the data via a REST (OpenAPI or Swagger) interface, available via `http://HOST:PORT/swagger.json`. Currently, there are three endpoints at `http://HOST:PORT/api`:
- Query the ESDL in a certain region, e.g. `/district/:id`.
- Query the ESDL in a certain region, and aggregate the data over a sub-region, e.g. `/district/neighbourhood/:id`.
- Get the capabilities of the service: `/capabilities`.
This also requires the exposed data to contain a geographic reference, for example a coordinate or the name of the area. In the following, you will find a description of how to make your energy data CSV available as ESDL.
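To illustrate how these routes combine with the `/api` prefix, the requests below use `PORT` for the port from your configuration and `REGION_ID` for a region identifier known to the configured regions service; both are placeholders, not values defined by this project:

```
http://localhost:PORT/api/district/REGION_ID
http://localhost:PORT/api/district/neighbourhood/REGION_ID
http://localhost:PORT/api/capabilities
```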
## Installing prerequisites
The easiest way is to fork and clone this repository, install all dependencies, and build the code. From the directory where you cloned the repository, run the commands from the build instructions (repeated here for convenience):
```bash
npm i -g yarn # If yarn isn't installed already
npm i         # On Linux, you may run into permission issues when registering the cli package.
```
Then run the following command twice (yes, twice) to work around some dependency issues:

```bash
npm run dev
```
## Exposing a CSV file as an ESDL service
Assuming you have a CSV file that contains energy-related information and you want to expose it as ESDL, you need to:

- Configure the ESDL service
- Optionally, reformat the CSV a bit so it can be consumed more easily
- Host the ESDL service
### Configuring the ESDL service
In the project's `config` folder, you can find many examples of existing sources that can serve as a baseline for your new service. As a first example, I'll use `zonnestroom.json`, and explain what you need to do to re-use it and expose your solar power CSV.

Some example data is presented in the table below. Please note that the source represents the geographical location in the `naam` column.
|naam|type|aantal_installaties|vermogen|woningen|naam_w|bedrijven|naam_b|
|----|----|----|----|----|----|----|----|
|Nederland|country|549505|2807377000|1602312|PV op woningen|1205066|PV op bedrijven|
|Drenthe|province|31273|160683000|97130|PV op woningen|63553|PV op bedrijven|
- In the `config` folder, clone `zonnestroom.json` to a new file, e.g. `solarpower.json`.
- In `solarpower.json`, replace the `serviceName` property with `solarpower`.
- As we are dealing with a CSV file, the `pluginService` is `@esdl/csv`. Alternatively, there is also an `@esdl/orm` plugin for exposing data from a database.
- The `regionsBaseUrl` should point to a micro-service that exposes regional data, containing the boundaries of areas (municipalities, districts, etc.).
- In the `rest` property, you specify the name of your service and the port that you wish to expose.
- In the `csv` section, you specify:
  - `filename`: path to the file.
  - `delimiter`: e.g. "," or ";".
  - `spatialType`: can be 'point' (default) or 'name'. In the former case, the data source should have a point location; otherwise an area's name is used. In the latter case, you also need to specify the following two properties. Internally, the following check is performed when filtering the data: `item[spatialColumn] === name && item[spatialTypeColumn] === regionType`. This is needed in order to separate, for example, the city of Groningen from the province of Groningen.
  - `spatialTypeColumn`: column name that contains the spatial type, e.g. province, municipality, district, or neighbourhood. The values need to be represented in the `regions` micro-service mentioned above.
  - `spatialColumn`: column name that contains the name of the area.
- Next, you define the ESDL entities that you want to expose (a minimal configuration sketch follows this list). Note that you can have more than one entity for one CSV file.
  - `name`: the ESDL entity it represents, such as a `WindTurbine`, `AggregatedBuilding` or, in this case, `PVPanel`.
  - `description`: a description.
  - `columns`: for each entry, the key represents the name of the ESDL attribute, so `columns.name: { "name": "naam_w", "type": "string" }` indicates that the CSV column `naam_w` will be exposed in ESDL as `name`, e.g. `<PVPanel name="..."></PVPanel>`. If the `name` attribute is missing, the column name is assumed to be the same as the key.
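Putting the properties above together, a minimal sketch of what `solarpower.json` could look like is shown below. The concrete values (port, regions URL, file path) and the exact nesting, e.g. whether `entities` sits inside the `csv` section or at the top level, are assumptions; compare against the shipped examples such as `zonnestroom.json` before relying on it.

```json
{
  "serviceName": "solarpower",
  "pluginService": "@esdl/csv",
  "regionsBaseUrl": "http://localhost:3000",
  "rest": {
    "name": "solarpower",
    "port": 4004
  },
  "csv": {
    "filename": "./data/solarpower.csv",
    "delimiter": ";",
    "spatialType": "name",
    "spatialTypeColumn": "type",
    "spatialColumn": "naam",
    "entities": [
      {
        "name": "PVPanel",
        "description": "Installed solar power per region",
        "columns": {
          "name": { "name": "naam_w", "type": "string" },
          "power": { "name": "vermogen", "type": "int" }
        }
      }
    ]
  }
}
```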
### Object and column transformers
|VID|Turbine|Gemeente|Fabrikant|Provincie|Type|Latitude|Ashoogte|Longitude|Diameter|Startjaar|Vermogen (kW)|Windpark|
|----|----|----|----|----|----|----|----|----|----|----|----|----|
|36241|Loppersum solitair|Loppersum|Lagerwey|Groningen|Onbekend|53.331546|24|6.635759|11|1982|15| |
|36242|Visafslag|De Marne|Lagerwey|Groningen|LW 15/75|53.407216|25|6.203135|16|1988|75| |
As another example, consider `windstats.json`: this source contains the location of wind turbines as a latitude/longitude pair. For that reason, the `entities` property contains an optional property, `transformer`, referring to a `LatLonPointTransformer`, an object transformer that transforms the latitude (`lat`) and longitude (`lon`) columns into an ESDL point.

In addition, there are also column transformers, such as the `StringTransformer`, which transforms an integer into a string, as required for ESDL `id` attributes:
"id": {
"type": "int",
"name": "VID",
"transformer": "StringTransformer"
},
Available transformers:

- `StringTransformer` (column transformer): transforms a number to a string.
- `NumberTransformer` (column transformer): transforms a string to a number.
- `GeoJSONTransformer` (column transformer): transforms a GeoJSON object to a string using `JSON.stringify`.
- `LatLonPointTransformer` (object transformer): transforms a `lat`, `lon` property pair into a GeoJSON Point. It is assumed that `lat` and `lon` are in WGS84. The original `lat` and `lon` properties are removed from the output.
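Whereas a column transformer is attached to a single column (as in the `id` snippet above), an object transformer is attached to the entity itself. A hedged sketch of what such an entity definition in `windstats.json` might look like follows; the column names are taken from the table above, but the entity name, description, and the `"number"` type are assumptions, so check the shipped `windstats.json` for the exact schema:

```json
{
  "name": "WindTurbine",
  "description": "Wind turbines with a lat/lon location",
  "transformer": "LatLonPointTransformer",
  "columns": {
    "id":  { "name": "VID", "type": "int", "transformer": "StringTransformer" },
    "lat": { "name": "Latitude", "type": "number" },
    "lon": { "name": "Longitude", "type": "number" }
  }
}
```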
## Starting a new service
### Locally
The commands below should be run from the `./packages/esdl-service` directory.
To test whether your service is working, run it locally:
```bash
ESDL_CONFIG=./config/YOUR_SERVICE_NAME.json npm run start:prod
```
Or on Windows:
```
set ESDL_CONFIG=config\YOUR_SERVICE_NAME.json
npm run start:prod
node dist\main.js
```
This will transpile the TypeScript code into JavaScript and run `node dist/main.js`. Next, you should be able to access your data from `http://localhost:PORT/api`, where `PORT` is the port that you configured in the JSON file.
### Docker
When everything is working as expected, you can create a Docker image and, optionally, host it. In the example below, replace YOUR_NAME, YOUR_SERVICE_NAME and YOUR_PORT with the values configured in the config file. In case your CSV file is not stored in the `data` folder, you need to adjust the source location as well. See the `../docker` folder for other examples.
```dockerfile
FROM timbru31/java-node as builder
COPY . .
RUN npm install typescript json-schema-to-typescript jest -g
RUN yarn
RUN yarn build

FROM node:8-slim as production
LABEL Author="YOUR_NAME"
WORKDIR .
# Copy ALL node_modules
COPY --from=builder node_modules node_modules
# Copy linked node_modules
COPY --from=builder ./packages/esdl-core/dist node_modules/@esdl/core/dist
COPY --from=builder ./packages/esdl-core/mappings node_modules/@esdl/core/mappings
COPY --from=builder ./packages/esdl-core/package.json node_modules/@esdl/core/package.json
COPY --from=builder ./packages/esdl-csv/dist node_modules/@esdl/csv/dist
COPY --from=builder ./packages/esdl-csv/package.json node_modules/@esdl/csv/package.json
COPY --from=builder ./packages/esdl-service/data data
COPY --from=builder ./packages/esdl-service/package.json ./
COPY --from=builder ./packages/esdl-service/config/YOUR_SERVICE_NAME.json ./config/YOUR_SERVICE_NAME.json
COPY --from=builder ./packages/esdl-service/dist dist
ENV NODE_ENV production
ENV ESDL_CONFIG ./config/YOUR_SERVICE_NAME.json
# Expose the REST interface (uses the same port as in the config file's REST section)
EXPOSE YOUR_PORT
CMD [ "node", "dist/main.js" ]
```
Now you are ready to build a new Docker container image:
```bash
$ docker build -f ./docker/YOUR_SERVICE_NAME/Dockerfile -t YOUR_SERVICE_NAME .
$ docker run -it -p YOUR_PORT:YOUR_PORT YOUR_SERVICE_NAME bash
```
In case `YOUR_PORT` is not available on your machine, you need to map it to a different port, e.g. 1234, as in `-p 1234:YOUR_PORT`.
### Docker Swarm
There is also an npm script available for running your image in Docker Swarm (you need to run it from the root folder of your cloned project). For example, if you want to expose `YOUR_SERVICE_NAME`, running on port 4567, on port 1234 in the Swarm, you run:
```bash
> SVC=YOUR_SERVICE_NAME EXTPORT=1234 INTPORT=4567 npm run docker

# For example
> SVC=zonnestroom EXTPORT=4004 INTPORT=4004 npm run docker
> SVC=windstats EXTPORT=4003 INTPORT=4003 npm run docker
```
NOTE: The first time, call `npm run docker1`, which will not try to kill an existing service (and fail, as it isn't there).