sigfox-gcloud-data
sigfox-gcloud-data is a sigfox-gcloud adapter for writing Sigfox messages into SQL databases like MySQL and Postgres. You may read and update Sigfox messages with other modules (such as sigfox-gcloud-ubidots) before passing them to sigfox-gcloud-data for writing to the database.
sigfox-gcloud-data works with most SQL databases supported by Knex.js, like MySQL, Postgres, MSSQL, MariaDB and Oracle.
sigfox-gcloud-data was built with sigfox-gcloud, an open-source software framework for building a Sigfox server with Google Cloud Functions and Google Cloud PubSub message queues. Check out the sigfox-gcloud README for details: https://github.com/UnaBiz/sigfox-gcloud/blob/master/README.md
sigfox-gcloud-data with MySQL: (screenshot)

sigfox-gcloud-data with Postgres: (screenshot)
Releases
- Version 1.0.1 (14 Oct 2017): Supports multiple instances
Getting Started
For development we support Linux, macOS and Ubuntu on Windows 10. Open a command prompt and enter these commands to download the sigfox-gcloud-data source folder to your computer:

```bash
git clone https://github.com/UnaBiz/sigfox-gcloud-data.git
cd sigfox-gcloud-data
```
If you're using Ubuntu on Windows 10, we recommend that you launch "Bash on Ubuntu on Windows" and enter the following commands to download the source files into the folder /mnt/c/sigfox-gcloud-data:

```bash
cd /mnt/c
git clone https://github.com/UnaBiz/sigfox-gcloud-data.git
cd sigfox-gcloud-data
```

That's because /mnt/c/sigfox-gcloud-data under bash is a shortcut to c:\sigfox-gcloud-data under Windows.
So you could use Windows Explorer and other Windows tools to browse and edit files in the folder.
Remember to use a text editor like Visual Studio Code that can save files using the Linux line-ending convention (linefeed only: \n), instead of the Windows convention (carriage return + linefeed: \r\n).
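As an optional safeguard (our suggestion, not part of the original steps), you may also configure Git to normalise line endings so that files keep linefeed endings:

```bash
# Optional: check out files with LF endings and convert any
# CRLF endings back to LF on commit.
git config --global core.autocrlf input
```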
Setting up Google Cloud
Install sigfox-gcloud with the base modules (exclude optional modules): https://github.com/UnaBiz/sigfox-gcloud/blob/master/README.md

Open a bash command prompt. For Windows, open "Bash on Ubuntu on Windows."

Create a file named .env in the sigfox-gcloud-data folder and populate the GCLOUD_PROJECT variable with your project ID. To do that, you may use this command (change myproject to your project ID):

```bash
cd sigfox-gcloud-data
echo GCLOUD_PROJECT=myproject >.env
```
Add the following sigfox-route setting to the Google Cloud Project Metadata store. This route says that all received Sigfox messages will be processed by the two steps decodeStructuredMessage and sendToDatabase:

```bash
gcloud compute project-info add-metadata --metadata=^:^sigfox-route=decodeStructuredMessage,sendToDatabase
```

If you're using sigfox-gcloud-ubidots, the sendToDatabase step should appear last, so that the updates from sendToUbidots will be recorded in the database:

```bash
gcloud compute project-info add-metadata --metadata=^:^sigfox-route=decodeStructuredMessage,sendToUbidots,sendToDatabase
```
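To verify that the route was saved, you may list the project metadata and look for the sigfox-route entry (this check is a suggestion on our part, not one of the original steps):

```bash
# The metadata is printed as YAML; the value follows the key line.
gcloud compute project-info describe | grep -A 1 sigfox-route
```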
Create the Google PubSub message queue that we will use to route the Sigfox messages between the Cloud Functions:

```bash
gcloud beta pubsub topics create sigfox.types.sendToDatabase
```

sigfox.types.sendToDatabase is the queue that will receive decoded Sigfox messages to be written to the database.

Go to the Google Cloud Metadata screen to define your database settings: https://console.cloud.google.com/compute/metadata
- sigfox-dbclient: Database client library to be used, e.g. mysql, pg. Check this page for the library: http://knexjs.org/#Installation-node
- sigfox-dbhost: Address of the database server, e.g. 127.127.127.127
- sigfox-dbuser: User ID for accessing the database, e.g. user
- sigfox-dbpassword: Password for accessing the database
- sigfox-dbname: Name of the database that will store the sensor data. Defaults to sigfox
- sigfox-dbtable: Name of the table that will store the sensor data. Defaults to sensordata
- sigfox-dbversion: Version number of the database, used only by Postgres, e.g. 7.2
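For reference, here is a minimal sketch of how these settings map onto a Knex.js connection. This is an illustration using the example values above, not the actual sigfox-gcloud-data source:

```javascript
// Illustrative mapping of the sigfox-db* metadata settings to Knex.js.
const knex = require('knex')({
  client: 'mysql',             // sigfox-dbclient
  connection: {
    host: '127.127.127.127',   // sigfox-dbhost
    user: 'user',              // sigfox-dbuser
    password: 'secret',        // sigfox-dbpassword
    database: 'sigfox',        // sigfox-dbname
  },
  version: '7.2',              // sigfox-dbversion (Postgres only)
});
```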
If the sigfox-dbtable table above does not exist, it will be created automatically.

Install the database library if you are NOT using MySQL or Postgres. Check this page for the library to be used: http://knexjs.org/#Installation-node
Then run the command npm install LIBRARYNAME --save. For example, if you're using MSSQL, you would run this command:

```bash
npm install mssql --save
```
Deploy the sendToDatabase Cloud Function with the deployall.sh script:

```bash
chmod +x */*.sh
scripts/deployall.sh
```
How it works
- Sigfox messages are pushed by the Sigfox Cloud to the Google Cloud Function sigfoxCallback
- Cloud Function sigfoxCallback delivers the message to PubSub message queue sigfox.devices.all, as well as to the device ID and device type queues
- Cloud Function routeMessage listens to PubSub message queue sigfox.devices.all and picks up the new message
- Cloud Function routeMessage assigns a route to the Sigfox message by reading the sigfox-route setting from the Google Compute Metadata Store. The route looks like this:

decodeStructuredMessage, sendToDatabase
This route first sends the message to function decodeStructuredMessage via the queue sigfox.types.decodeStructuredMessage.

decodeStructuredMessage contains the logic to decode a compressed message format that we call the Structured Message Format. Within a 12-byte Sigfox message, the Structured Message Format can efficiently encode 3 sensor field values together with their sensor field names.

For example, the encoded 12-byte message b0513801a421f0019405a500 contains 3 sensor values (temperature, humidity, altitude) and their field names:

tmp = 31.2, hmd = 49.6, alt = 16.5
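Conceptually, the decoded fields are merged into the message. A hypothetical example of the result (the exact envelope fields are an assumption, not taken from the source):

```javascript
// Hypothetical decoded message after decodeStructuredMessage:
// the 3 decoded sensor fields sit alongside the original payload.
const decodedMessage = {
  device: '2C30EB',                  // Sigfox device ID
  data: 'b0513801a421f0019405a500',  // original 12-byte payload
  tmp: 31.2,                         // temperature
  hmd: 49.6,                         // humidity
  alt: 16.5,                         // altitude
};
```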
According to the sigfox-route above, the resulting decoded message is sent next to function sendToDatabase via the queue sigfox.types.sendToDatabase.

sendToDatabase appends the received Sigfox message to the sensordata table that you have defined in the Google Cloud Metadata settings. It calls the Knex.js library to update the database.

sendToDatabase automatically matches the received Sigfox message fields with the sensordata fields. So if your Sigfox message includes a new field (perhaps by decoding a Structured Message) and the sensordata table also contains a field by that name, sendToDatabase will write the new field into the sensordata table. A sketch of this matching appears after the links below.

See this doc for the definition of Structured Messages:
https://unabiz.github.io/unashield/
To understand how Structured Messages may be used with the Ubidots IoT platform, check the UnaShield Tutorial for Ubidots:
https://unabiz.github.io/unashield/ubidots
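Here is a sketch of the field matching performed by sendToDatabase, as described above. It is not the actual implementation, just an illustration of the idea using the Knex.js API:

```javascript
// Keep only the message fields that exist as columns in the target
// table, then append the row. knex(table).columnInfo() returns an
// object keyed by column name.
async function writeSensorData(knex, table, message) {
  const columns = await knex(table).columnInfo();
  const row = {};
  for (const field of Object.keys(message)) {
    if (field in columns) row[field] = message[field];  // matched field
  }
  return knex(table).insert(row);
}
```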
Viewing sigfox-gcloud-data server logs

You may view the logs through the Google Cloud Logging Console. Select "Cloud Function" as the "Resource".

From the screen above you can see the logs generated as each Sigfox message is processed in stages by sigfox-gcloud:
- Sigfox Device IDs are shown in square brackets, e.g. [ 2C30EB ]
- Completed Steps are denoted by _<<_
- sigfoxCallback is the Google Cloud Function that listens for incoming HTTPS messages delivered by Sigfox
- routeMessage passes the Sigfox message to various Google Cloud Functions to decode and process the message
- decodeStructuredMessage decodes a compressed Sigfox message that contains multiple field names and field values
- sendToDatabase would appear after decodeStructuredMessage. sendToDatabase writes the decoded sensor data to the database via the Knex.js library.
Tracing sigfox-gcloud-data server performance

The Google Cloud Trace Console shows you the time taken by each step of the Sigfox message processing pipeline, tracing the message through every Google Cloud Function. Each message delivered by Sigfox appears as a separate trace timeline. Messages are shown like 2C30EB seq:1913, where 2C30EB is the Sigfox Device ID and 1913 is the Sigfox Message Sequence Number (seqNumber).

The Google Stackdriver Trace API needs to be enabled manually. Custom reports may be created in the Google Cloud Trace Console to benchmark the performance of each processing step over time.
Understanding and troubleshooting the sigfox-gcloud-data server

To understand each processing step in the sigfox-gcloud-data server, you may use the Google Cloud Debug Console to set breakpoints and capture in-memory variable values for each Google Cloud Function, without stopping or reconfiguring the server.

In the example below, we have set a breakpoint in the sigfoxCallback Google Cloud Function. The captured in-memory values are displayed at right; you can see the Sigfox message that was received by the callback. The Callback Stack appears at the lower right.

Google Cloud Debug is also useful for troubleshooting your custom message processing code without having to insert the debugging code yourself.
Testing the sigfox-gcloud-data server

Send some Sigfox messages from the Sigfox devices. Monitor the progress of the processing through the Google Cloud Logging Console. Select "Cloud Function" as the "Resource".

Processing errors will be reported to the Google Cloud Error Reporting Console. We may configure Google Cloud Stackdriver Monitoring to create incident reports upon detecting any errors. Stackdriver may also be used to generate dashboards for monitoring the PubSub message processing queues.
Demo
To send messages from a Sigfox device into your database, you may use this Arduino sketch:
https://github.com/UnaBiz/unabiz-arduino/blob/master/examples/send-light-level/send-light-level.ino
The sketch sends 3 field names and field values, packed into a Structured Message:

- ctr: message counter
- lig: light level, based on the Grove analog light sensor
- tmp: module temperature, based on the Sigfox module's embedded temperature sensor
Alternatively, you may test by sending a Sigfox message from your Sigfox device with the data field set to:

920e82002731b01db0512201
We may also use a URL testing tool like Postman to send a POST request to the sigfoxCallback URL, e.g. (change myproject to your Google Cloud Project ID):

https://us-central1-myproject.cloudfunctions.net/sigfoxCallback

Set the Content-Type header to application/json. If you're using Postman, click Body -> Raw -> JSON (application/json). Set the body to:

```json
{
  "device": "1A2345",
  "data": "920e82002731b01db0512201",
  "time": "1476980426",
  "duplicate": "false",
  "snr": "18.86",
  "station": "0000",
  "avgSnr": "15.54",
  "lat": "1",
  "lng": "104",
  "rssi": "-123.00",
  "seqNumber": "1492",
  "ack": "false",
  "longPolling": "false"
}
```

where device is your Sigfox device ID.

Here's the request in Postman: (screenshot)
We may use the curl command as well. Remember to change myproject and 1A2345 to your project ID and device ID:

```bash
curl --request POST \
  --url https://us-central1-myproject.cloudfunctions.net/sigfoxCallback \
  --header 'cache-control: no-cache' \
  --header 'content-type: application/json' \
  --data '{"device":"1A2345", "data":"920e82002731b01db0512201", "time":"1476980426", "duplicate":"false", "snr":"18.86", "station":"0000", "avgSnr":"15.54", "lat":"1", "lng":"104", "rssi":"-123.00", "seqNumber":"1492", "ack":"false", "longPolling":"false"}'
```
The response from the callback function should look like this:
{ "1A2345": { "noData": true } }
The test message sent above will be decoded and written to your sensordata table as:

- ctr (counter): 13
- lig (light level): 760
- tmp (temperature): 29
The other fields of the Sigfox message will be written as well.
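To confirm the rows were written, you could query the table with Knex.js (an illustrative check; the timestamp column name is an assumption):

```javascript
// Fetch the most recent rows from the sensordata table.
knex('sensordata')
  .select('device', 'ctr', 'lig', 'tmp')
  .orderBy('timestamp', 'desc')  // assumes a timestamp column exists
  .limit(5)
  .then(rows => console.log(rows));
```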
Adding one or more instances of sendToDatabase

It's possible to run 2 or more Cloud Functions that will update different databases. The Cloud Functions should be named:

sendToDatabase, sendToDatabase2, sendToDatabase3, ...

and the configuration for each function shall be set in the Google Cloud Metadata screen as:

sigfox-dbclient, sigfox-dbclient2, sigfox-dbclient3, ...

For example, this metadata screen defines 2 database settings, for MySQL and Postgres: (screenshot)
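The second instance's settings follow the same pattern with a 2 suffix. For example (the values here are illustrative, not from the original screenshot), a second, Postgres-based instance could be configured with:

```bash
# Illustrative only: settings for a second instance of sendToDatabase.
gcloud compute project-info add-metadata --metadata \
  sigfox-dbclient2=pg,sigfox-dbhost2=127.127.127.127,sigfox-dbuser2=user,sigfox-dbname2=sigfox
```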
To deploy the second instance of sendToDatabase, edit the script scripts/deploy.sh and uncomment the second functiondeploy.sh line so it looks like:

```bash
./scripts/functiondeploy.sh ${name}2 ${localpath} ${trigger} ${topic}
```

Run scripts/deploy.sh. This will deploy a new function sendToDatabase2 that uses the second database setting in the Google Cloud Metadata screen.
To deploy sendToDatabase3, sendToDatabase4, ... you may edit scripts/deploy.sh accordingly:

```bash
./scripts/functiondeploy.sh ${name}3 ${localpath} ${trigger} ${topic}
./scripts/functiondeploy.sh ${name}4 ${localpath} ${trigger} ${topic}
```

Note that all instances of sendToDatabase will read Sigfox messages from the sigfox.types.sendToDatabase queue simultaneously. The database updates will run in parallel.