git-ops
git-ops is a set of tools for creating automated operations contained in the repositories of the projects that need them. The model adopted by this project is a simple git "ops" subcommand that allows developers, DevOps engineers, and SREs to construct and execute their CI/CD pipelines independently of any tool or platform.
Motivation
This project arose from the desire to control continuous integration and continuous deployment/delivery flows autonomously, as part of the project itself. Many CI/CD tools, such as Jenkins, Travis, and GitLab, implement some form of pipeline scripting, but the execution dependencies remain tied to the tool.
Some projects implement the concept of shared libraries to solve this problem, yet the pipelines still depend on the execution tool. How can 100% of the operations be contained in a project's repository? That is the challenge we are trying to solve.
Quick Start
- Go to releases and download the version that fits your system
- Extract the executable into a folder where you can leave it permanently
- Execute the install command, for example:
./git-ops-linux-x64 install
(you may need to run chmod +x git-ops-YOUR_OS-ARCH before)
- Now that git-ops is installed, execute the following command to get help:
git ops -h
How it's built
git-ops is written in JavaScript for Node.js 10 and compiled into an executable, so you do not need to install any dependencies to use git-ops. The exception is the npmInstall function: if your JS steps use it to download dependencies, you need to have npm installed.
Tutorial
Understanding KDBX
A KDBX file is a specialized database for storing sensitive information; you can read more about the format here. Think of it as a vault for data. We recommend using KeePassXC to manipulate your KDBX files.
There are many advantages to keeping sensitive data in a KDBX file, but we will not delve into them here. You will find that being a portable and secure database is perhaps the most important one for git-ops.
In git-ops the KDBX files are used to:
- Store sensitive data (obvious)
- Store useful data to execute CI/CD flows (like a key-value or key-attachment database)
- Store scripts to execute the steps of pipelines
- Create shared operations assets
We're going to see that the combination of git-ops and KeePassXC is a really powerful one.
Creating my first ops pack
An ops pack is a set of data and scripts used to run pipelines (from ops files). Let's see it in practice by following the documentation.
- In an empty folder create the following folder structure:
/kdbx
/ops
  /MyOps
    /Js
      /Hash
    /Shell
      /Print
- Create the following file: /ops/MyOps/Js/Hash/pack.kdbx.yaml
entry:
  title: hashRepository
  data:
    - type: field
      name: title
      value: hashRepository
    - type: field
      name: hashAlgorithm
      value: md5
    - type: field
      name: hashDirectory
      value: ../myAwesomeProject/**/*
    - type: attachment
      name: getFiles.js
      path: ./getFiles.js
    - type: attachment
      name: hashFiles.js
      path: ./hashFiles.js
- Create the file: /ops/MyOps/Js/Hash/getFiles.js
npmInstall('glob');

const path = require('path');
const glob = require('glob');

var files = glob.sync(
  path.join(__cwd, hashDirectory),
  { nodir: true }
);
- Create the file: /ops/MyOps/Js/Hash/hashFiles.js
const crypto = require('crypto');
const fs = require('fs');

logger.info(`Files will be hashed using ${hashAlgorithm}`);

const hash = crypto.createHash(hashAlgorithm);

for (file of files) {
  logger.info(`Hashing file: ${file}`);
  hash.update(fs.readFileSync(file));
}

var filesHash = hash.digest('hex');

logger.info(`${files.length} hashed: ${filesHash}`);

kdbx.setSourceValue(
  kdbx.KDBX_DB,
  {
    group: ['MyOps', 'Js', 'Hash'],
    entry: {
      title: 'filesHash'
    },
    data: [
      { type: 'field', name: 'title', value: 'filesHash' },
      { type: 'field', name: 'filesHash', value: filesHash }
    ]
  }
);
- Now we're going to create the shell entry: /ops/MyOps/Shell/Print/pack.kdbx.yaml
entry:
  title: printHash
  data:
    - type: field
      name: title
      value: printHash
    - type: attachment
      name: generateMd.sh
      path: ./generateMd.sh
- Create the shell script that appends the generated hash to a markdown file: /ops/MyOps/Shell/Print/generateMd.sh
#!/bin/sh

CURRENT_DATE=`date`

echo "Hash at ${CURRENT_DATE}: ${FILES_HASH}" >> HASH_HISTORY.md
- Now that we have all the assets to build our KDBX, we just need to open a terminal in / and execute the following command:
git ops pack -p MyPassword123 -k ./kdbx/MyOps.key ./kdbx/MyOps.kdbx ./ops
You'll see that git-ops generated the following files:
- /kdbx/MyOps.kdbx - The KDBX file with all the content we coded; you can open this file with KeePassXC to see what happened. This file is what we call an ops pack.
- /kdbx/MyOps.key - The secret key to open the KDBX file. git-ops pack will always generate a secret key if the file specified for the -k option does not exist. Without this file and the password (-p option) the ops pack is useless. You don't need to specify a password if you just want to use a key file; this is useful, for example, when your ops pack contains no sensitive data and you commit the key file to the project repository. Otherwise you should not commit the key file: it is like an SSL certificate's secret key. To make sure you don't commit this file accidentally, we recommend putting the kdbx/*.key pattern in your .gitignore.
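To illustrate, a minimal .gitignore entry for the layout used in this tutorial could look like the following sketch (adjust the path if your key files live elsewhere):

# keep generated KDBX secret keys out of version control
kdbx/*.key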
Creating my first ops file
An ops file is a file containing the specifications of pipelines, which are known as "operations". It is the equivalent of a Jenkinsfile, .gitlab-ci.yml, .travis.yml, and other CI/CD scripting files.
- In an empty folder create the following folder structure:
/myAwesomeProject
/cicd
- Let's create a test file to simulate a real project: /myAwesomeProject/hello.sh
#!/bin/bash

echo 'Hello world!'
- Copy MyOps.kdbx and MyOps.key into the /cicd folder
- And finally we're going to create our ops file: /cicd/operations.yaml
operation:
  name: myFirstPipeline
  pipeline:
    - type: js
      source:
        group:
          - MyOps
          - Js
          - Hash
        entry:
          title: hashRepository
        value:
          type: attachment
          name: getFiles.js
      contextMap:
        - name: hashDirectory
          source:
            group:
              - MyOps
              - Js
              - Hash
            entry:
              title: hashRepository
            value:
              type: field
              name: hashDirectory
    - type: js
      source:
        group:
          - MyOps
          - Js
          - Hash
        entry:
          title: hashRepository
        value:
          type: attachment
          name: hashFiles.js
      contextMap:
        - name: hashAlgorithm
          source:
            group:
              - MyOps
              - Js
              - Hash
            entry:
              title: hashRepository
            value:
              type: field
              name: hashAlgorithm
    - type: bash
      source:
        group:
          - MyOps
          - Shell
          - Print
        entry:
          title: printHash
        value:
          type: attachment
          name: generateMd.sh
      envMap:
        - name: FILES_HASH
          source:
            group:
              - MyOps
              - Js
              - Hash
            entry:
              title: filesHash
            value:
              type: field
              name: filesHash
We can notice a few things about the ops file. The first is that we use the KDBX entries as the sources of the code that runs in the pipeline. The second is that we can map KDBX fields and attachments as environment variables or as execution context (when the step is a JavaScript script). The execution context of a JavaScript script is the global scope, i.e. any variable created in the script stays in the pipeline execution context; this is how we pass the files listed in the first script to the second.
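As a minimal illustration of that shared context (the file names and variable below are hypothetical and not part of the tutorial), a variable declared with var in one JS step can be read directly by a later JS step:

// stepOne.js - a hypothetical first JS step
// `var` declarations land in the shared pipeline execution context
var buildLabel = 'my-build-' + Date.now();
logger.info(`Created label: ${buildLabel}`);

// stepTwo.js - a hypothetical later JS step in the same pipeline
// buildLabel is already defined because both steps share the same context
logger.info(`Reusing label from the previous step: ${buildLabel}`);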
Using git-ops
Now that we have our ops pack and our ops file we can finally run our operations. To do this, open a terminal in the / folder from the previous step and execute the following command:
git ops execute -x ./cicd/MyOps.kdbx -p MyPassword123 -k ./cicd/MyOps.key ./cicd/operations.yaml myFirstPipeline
And so you ran your first pipeline using git-ops!
Let's also learn one more cool thing: you can customize ops packs without having to change their source code.
- Create a KDBX file with KeePassXC in the /cicd folder and create an entry in the Root group with the title hashAlgorithm and an additional attribute named hashAlgorithm with the value sha1. You can use the same key as the previous KDBX to encrypt your file.
- Change the contextMap of the second step of the pipeline to get the value from our new KDBX:
contextMap:
  - name: hashAlgorithm
    source:
      group:
        - Root
      entry:
        title: hashAlgorithm
      value:
        type: field
        name: hashAlgorithm
- And execute your operation now passing the two KDBX files:
git ops execute -x ./cicd/MyOps.kdbx,./cicd/YOUR_KDBX.kdbx -p MyPassword123,YOUR_PASSWORD -k ./cicd/MyOps.key,./cicd/MyOps.key ./cicd/operations.yaml myFirstPipeline
And now we are generating the hashes with the sha1 algorithm. This feature is very useful when, for example, you need to specify values specific to the CD environment (API keys, certificates, secrets, among others). When you specify more than one KDBX for the execute command, all of them are merged, with the rightmost files overwriting the ones to their left. Note that the passwords and key files are also passed comma-separated.
JS VM Context
When JS code is executed as a pipeline step, it runs inside the Node.js VM module. The provided context contains the following globals:
- __cwd - A constant storing the current directory of the pipeline; this value should always be the directory of the ops file.
- npmInstall(...dependencies) - A synchronous function to download dependencies for your script; these dependencies are ephemeral and will be deleted after the step execution.
- logger - An instance of bunyan to log actions from your script.
- kdbx.KDBX_DB - An instance of the main KDBX database (see the kdbxweb package); this will always be the first loaded KDBX file.
- kdbx.KDBX_DBS - An object with a key-value mapping to all loaded KDBX instances.
- kdbx.setSourceValue(db, sourceSpec) - A function to set values into loaded KDBX instances.
- kdbx.save(kdbxDb, directory) - An asynchronous function to save KDBX instances into files.
- endPipelineStep([error]) - If you're running asynchronous code inside the JS VM you must call this callback function when the execution is done; it makes git-ops wait for the execution of an asynchronous pipeline step.
- Every context value that you mapped. Note that if you declare a var like var x = 1, the value will be available in the subsequent JS VM scripts.
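As a minimal sketch of how these globals fit together (the step file name and the marker file below are hypothetical, only for illustration), an asynchronous JS step could look like this:

// asyncStep.js - a hypothetical pipeline step doing asynchronous work
const fs = require('fs');

logger.info('Writing a build marker asynchronously');

// fs.writeFile is asynchronous, so we signal completion via endPipelineStep
fs.writeFile('BUILD_MARKER.txt', `built at ${new Date().toISOString()}\n`, err => {
  if (err) {
    logger.error(err, 'Failed to write the build marker');
    endPipelineStep(err); // report the error so the pipeline can fail
    return;
  }
  logger.info('Build marker written');
  endPipelineStep(); // tell git-ops the asynchronous step has finished
});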