# Grafana dashboard migration script

This script can be used to migrate existing Grafana dashboards from the CDF Grafana plugin v1.0.x to v2.1.x+. Note: this script is not compatible with plugin v2.0.0/v2.0.1; consider upgrading to v2.1.0 directly.
## Requirements

- A Grafana instance with the Cognite Data Fusion plugin v1.0.x.
- A Grafana API key with admin access.
- CDF API keys for the projects used on the instance (with the capability `timeseriesAcl:READ`). Note: you don't need to know the project names in advance; the script will ask for them.
- Any computer with Node.js v10 installed. The script uses APIs and doesn't need to be executed directly on the Grafana instance.
## Step #0

Upgrade the Cognite datasource plugin to v2.0.0+.
## Step #last

- Run `npx @cognite/grafana-migration-script --grafanaUrl <grafana-url> --grafanaApiKey <grafana-api-key>`, where:
  - `<grafana-url>` – the URL of your Grafana instance (e.g. `http://localhost:3000`).
  - `<grafana-api-key>` – a token that you need to generate from your Grafana instance; check the Grafana documentation (e.g. `eyJrIj...oxfQ==`). An API key is always attached to a specific organisation, so in order to migrate multiple organisations you need to execute `npx @cognite/grafana-migration-script` once for each API key.

  Optional flags:
  - `--rollback` – revert the changes made by the migration script.
  - `--dryRun` – test the migration script without writing anything back to the Grafana instance.

  Note: `npx` comes bundled with Node.js, no installation required. An example invocation is sketched after this list.
- After that you'll be prompted for a Cognite API key for each project. Enter the API keys one by one, or leave the field empty to skip a specific project.
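For illustration only, here is a minimal sketch of driving the same command from a small Node.js/TypeScript script. It uses only the flags documented above; the URL and API key values are placeholders, and running the command directly in a terminal works just as well.

```typescript
// Minimal sketch: invoking the migration script via npx from Node.js.
// Only the documented flags are used; the URL and API key are placeholders.
import { execFileSync } from 'child_process';

const grafanaUrl = 'http://localhost:3000'; // placeholder
const grafanaApiKey = '<grafana-api-key>';  // placeholder, generate it in Grafana

execFileSync(
  'npx',
  [
    '@cognite/grafana-migration-script',
    '--grafanaUrl', grafanaUrl,
    '--grafanaApiKey', grafanaApiKey,
    '--dryRun', // drop this flag to actually write the migrated dashboards
  ],
  // inherit stdio so the interactive prompts for CDF API keys still work
  { stdio: 'inherit' },
);
```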
After a successful script execution you should see messages like this in the console output:

```
dashboard-name – status: SUCCESS
```

The script will also print the difference between the old and the new dashboard JSON configurations.
## Debugging

If something goes wrong during the migration, there are a few options:

- If dashboards are severely damaged, the easiest way is to revert them to their previous state. Execute:

  ```
  node dist/main.js --grafanaUrl <grafana-url> --grafanaApiKey <grafana-api-key> --rollback
  ```

  Note: avoid executing this too many times; Grafana has a limit on the number of stored dashboard versions (default: 20).
- If only minor issues were discovered, it might be better to fix them by editing the dashboards via the Grafana UI. After each script execution a `debug-[[timestamp]].log` file is created. It contains error messages and JSON diffs, so you can use it to fix broken dashboards manually.
- If you have any questions, please contact our support team on Slack or via [email protected].
## Under the hood

This script translates the dashboard JSON configuration from CDF Grafana plugin syntax v1.0.x to v2.0.0.
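Conceptually the migration is a read-transform-write round trip against the Grafana HTTP API, which is why an admin API key is needed. The sketch below illustrates that round trip using standard Grafana endpoints; it is not the script's actual code, the endpoints and payload shapes are assumptions about a typical Grafana setup, and `migrateDashboard` is a hypothetical stand-in for the translation rules described in the sections that follow.

```typescript
// Illustrative sketch of the read-transform-write round trip, not the script's
// actual code. Assumes a runtime with a global fetch (Node 18+); the shipped
// script targets Node 10, where global fetch is not available.
const GRAFANA_URL = 'http://localhost:3000'; // placeholder
const GRAFANA_API_KEY = '<grafana-api-key>'; // placeholder

const headers = {
  Authorization: `Bearer ${GRAFANA_API_KEY}`,
  'Content-Type': 'application/json',
};

// Hypothetical stand-in for the translation rules described below.
function migrateDashboard(dashboard: unknown): unknown {
  return dashboard; // the real script rewrites targets, variables and annotations
}

async function migrateAll(): Promise<void> {
  // 1. List the dashboards of the organisation the API key belongs to.
  const searchRes = await fetch(`${GRAFANA_URL}/api/search?type=dash-db`, { headers });
  const refs: Array<{ uid: string }> = await searchRes.json();

  for (const { uid } of refs) {
    // 2. Read the full dashboard JSON.
    const dashRes = await fetch(`${GRAFANA_URL}/api/dashboards/uid/${uid}`, { headers });
    const { dashboard } = await dashRes.json();

    // 3. Rewrite time series targets, template variables and annotations.
    const migrated = migrateDashboard(dashboard);

    // 4. Write the result back as a new dashboard version.
    await fetch(`${GRAFANA_URL}/api/dashboards/db`, {
      method: 'POST',
      headers,
      body: JSON.stringify({ dashboard: migrated, overwrite: true, message: 'CDF plugin migration' }),
    });
  }
}

migrateAll().catch((err) => console.error(err));
```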
### Changes

#### Time series changes

- `target: [timeseries_name]` is replaced with the corresponding timeseries id
- custom query syntax changes:
| Feature | v1.0.x | v2.0.0 |
|---|---|---|
| Root asset | Value from input field | Part of the custom query filter syntax, e.g. `ts{assetSubtreeIds=[{id=ASSET_ID}]}` |
| Time series reference | `[ID]` or `[ID, AGGREGATE, GRANULARITY]` | `ts{id=TS_ID}` or `ts{id=TS_ID, aggregate='AGGREGATE', granularity="GRANULARITY"}` |
| Filter with aggregates | `timeseries{}[AGGREGATE, GRANULARITY]` | `ts{aggregate='AGGREGATE', granularity='GRANULARITY'}` |
| Name filter | `timeseries{name='TS_NAME'}` | `ts{externalId=TS_EXT_ID}` or `ts{id=TS_ID}` |
| Name RegExp filter | `timeseries{name=~'.*regex.*'}` | `ts{name=~'.*regex.*'}` |
| Metadata filters | `timeseries{metadata.type=~'.*well.*', metadata.tag='1'}` | `ts{metadata={type=~'.*well.*', tag='1'}}` |
| Math functions | `sum`, `max`, `min`, `avg`, `pow`, `sin`, `cos`, `pi`, `sqrt`, `abs`, `exp`, `ln`, `round`, e.g. `sum(timeseries{})` or `SUM(timeseries{})` | Lower-cased, e.g. `sum(ts{})`. In addition, `power` is converted to `pow` |
| Unsupported functions | `acos`, `asin`, `atan`, `ceil`, `cot`, `degrees`, `floor`, `log`, `log2`, `log10`, `radians`, `sign`, `tan`, `atan2`, `mod`, `truncate` | Omitted by the script, not supported in API v1 |
| Aggregation shortcuts | `avg`, `int`, `step`, `cv`, `dv`, `tv` | Changed to the full versions: `average`, `interpolation`, `stepInterpolation`, `continuousVariance`, `discreteVariance`, `totalVariation`. If these were used in variables, like `[TS_ID, $aggregate]`, the script will try to migrate them as well |
| Non-quoted filters | `timeseries{description=VAL}` | `ts{description="VAL"}` |
Custom queries use the synthetic time series feature from API v1. See the Cognite API documentation to find out more about synthetics.
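As a concrete illustration of two of the rows above, the sketch below shows the aggregation-shortcut expansion and the lower-casing of math functions (including `power` to `pow`). It is only a sketch of the mapping described in the table, not the script's actual implementation.

```typescript
// Sketch of the mappings from the table above (not the script's actual code).
const AGGREGATE_SHORTCUTS: Record<string, string> = {
  avg: 'average',
  int: 'interpolation',
  step: 'stepInterpolation',
  cv: 'continuousVariance',
  dv: 'discreteVariance',
  tv: 'totalVariation',
};

// avg -> average, cv -> continuousVariance, anything else is kept as-is.
function expandAggregate(shortcut: string): string {
  return AGGREGATE_SHORTCUTS[shortcut] ?? shortcut;
}

// SUM -> sum, POWER -> pow.
function normalizeFunctionName(name: string): string {
  const lower = name.toLowerCase();
  return lower === 'power' ? 'pow' : lower;
}
```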
#### Template variable changes

The `query` and `filter` fields are replaced with one all-in-one query. Instead of separate `asset{}` and `filter{}`, the same result can be achieved with the `assets{}` syntax.
| Feature | v1.0.x | v2.0.0 |
|---|---|---|
| Asset sub-trees | `asset{assetSubtrees=[ID1,ID2]}` | `assets{assetSubtreeIds=[{id=ID1}, {externalId='EXT_ID2'}]}` |
| Metadata query | `asset{metadata={"key": "VAL1", "key2": "VAL2"}}` | `assets{metadata={key='VAL1', key2='VAL2'}}` |
| Timestamps query | Filters: `minCreatedTime`, `maxCreatedTime`, `minLastUpdatedTime`, `maxLastUpdatedTime`, e.g. `asset{lastUpdatedTime=0}` | Similar to API v1: `assets{lastUpdatedTime={min=0, max=1}}` |
| Filter | `filter{name=~".*"}` | Now part of the query: `assets{name=~'.*'}` |
| Unsupported properties | `sort`, `dir`, `offset`, `boostName` | Not supported; the script removes them from the query |
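For illustration, the property-level rename and the removals from the table above amount to a simple lookup, sketched below; this is not the script's actual code.

```typescript
// Sketch of the property mapping described in the table above
// (not the script's actual code).
const RENAMED_PROPERTIES: Record<string, string> = {
  assetSubtrees: 'assetSubtreeIds', // asset{assetSubtrees=[...]} -> assets{assetSubtreeIds=[...]}
};

const REMOVED_PROPERTIES = new Set(['sort', 'dir', 'offset', 'boostName']);

// Returns the v2 property name, or undefined if the property is dropped.
function mapVariableQueryProperty(name: string): string | undefined {
  if (REMOVED_PROPERTIES.has(name)) return undefined;
  return RENAMED_PROPERTIES[name] ?? name;
}
```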
#### Annotation changes

Similar to template variables, the `query` and `filter` fields are replaced with one query. Instead of separate `event{}` and `filter{}`, the new `events{}` syntax is used.

The filtering syntax is changed in the same way as for template variables; use the comparison table above for examples.
#### Other JSON changes

- Properties removed: `func`, `assetQuery.templatedTarget`, `assetQuery.func`, `assetQuery.timeseries` (see the sketch below).
- `target: 'Start typing tag id here'` is replaced with an empty string.
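The sketch below shows roughly what this cleanup looks like for a single panel target object. The `LegacyTarget` shape is an assumption for illustration; only the properties listed above are taken from this document, and the code is not the script's actual implementation.

```typescript
// Sketch of the structural cleanup listed above (not the script's actual code).
// The LegacyTarget shape is assumed for illustration.
interface LegacyTarget {
  target?: unknown;
  func?: unknown;
  assetQuery?: {
    templatedTarget?: unknown;
    func?: unknown;
    timeseries?: unknown;
    [key: string]: unknown;
  };
  [key: string]: unknown;
}

function cleanTarget(t: LegacyTarget): LegacyTarget {
  const copy: LegacyTarget = { ...t, assetQuery: t.assetQuery && { ...t.assetQuery } };
  delete copy.func;
  if (copy.assetQuery) {
    delete copy.assetQuery.templatedTarget;
    delete copy.assetQuery.func;
    delete copy.assetQuery.timeseries;
  }
  if (copy.target === 'Start typing tag id here') {
    copy.target = ''; // the placeholder text is replaced with an empty string
  }
  return copy;
}
```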