@f5-pwe/kog
v1.5.9
Process Workflow Engine runner
Overview
Kog - process workflow engine. A cross-platform CLI to execute a PWE job.
Install with npm
npm install -g @f5-pwe/kog
# When running as root user:
npm install -g --unsafe-perm @f5-pwe/kog
Verify you can run kog
npx kog run -f https://gitlab.com/f5-pwe/kog/raw/master/data/test/success.yaml
Execute Jobs
Kog supports executing a job either via Docker directly or as a Kubernetes Job.
- --executor docker (the default) - Set the DOCKER_HOST environment variable to a valid docker host, for example tcp://localhost:2375
- --executor k8s - This uses the KUBECONFIG env var and the current context to determine which cluster and namespace to use
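For example, assuming a local Docker daemon listening on tcp://localhost:2375, or a kubeconfig at its default location for the k8s executor:
# Run the test workflow with the docker executor (the default)
DOCKER_HOST=tcp://localhost:2375 npx kog run --executor docker -f https://gitlab.com/f5-pwe/kog/raw/master/data/test/success.yaml
# Run the same workflow as a kubernetes Job in the current kubeconfig context
KUBECONFIG=$HOME/.kube/config npx kog run --executor k8s -f https://gitlab.com/f5-pwe/kog/raw/master/data/test/success.yaml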
Workflows
There are a few workflows checked in under the data folder. The simplest workflow to execute is data/test/success.yaml, which executes an echo command in an alpine container.
npx kog run -f https://gitlab.com/f5-pwe/kog/raw/master/data/test/success.yaml
npx kog run -f /local/path/to/workflow.json
The --file|-f flag works for both remote and local files, encoded in either JSON or YAML.
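For instance, you can fetch the checked-in example and run it from disk; curl here is just one way to download it:
curl -fsSL -o success.yaml https://gitlab.com/f5-pwe/kog/raw/master/data/test/success.yaml
npx kog run -f ./success.yaml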
Workflow Action Container Specification
- Containers MUST have an entrypoint that reads the KOG_CONTEXT env var as a base64-encoded, JSON-serialized Context (a minimal entrypoint sketch follows this list)
- Containers MUST print to stdout a prefixed line that contains the JSON-serialized ActionResult
  - eg KOG:{"result":"success","rc":0,"message":"auto success","context":{"action_name":"auto-success","correlation_id":"btzb6nwmekwj","run_id":"btzb6nwmengl","step_name":"start","task_name":"success test"}}
- The context from the ActionResult is passed into the next step in the workflow
- IF you support the docker executor type, your Dockerfile MUST have an ENV KOG_ENV_CONTEXT=true
  - Example Dockerfile
  - This just tells the docker executor not to use stdin to pass the context; that is how it used to be done and I did not want to break backwards compatibility
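A minimal sketch of a conforming entrypoint, assuming a POSIX shell image with base64 available; the file name and the success message are illustrative, not part of kog:
#!/bin/sh
# entrypoint.sh (illustrative): decode the incoming Context from KOG_CONTEXT
CONTEXT=$(printf '%s' "$KOG_CONTEXT" | base64 -d)
# ... do the real work of this action here ...
# Emit the KOG:-prefixed ActionResult on stdout; the context is passed through
# unchanged so the next workflow step receives it
printf 'KOG:{"result":"success","rc":0,"message":"done","context":%s}\n' "$CONTEXT"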
Context
Every kog job and step has a Context that it is executed within. This context is passed to each workflow step, which can use it to configure how the step runs. Each step returns a context that is passed to the next step, so it is common to put results from the current step into the context before returning it so the next step can use those details.
--context /path/to/context.json
The --context flag works for both remote and local files, encoded in either JSON or YAML.
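For example, to seed the first step with an initial context from a local file (the paths are placeholders):
npx kog run -f /local/path/to/workflow.json --context /path/to/context.json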
Environment Variables
There are three ways to pass env vars to a workflow step:
- --env VARIABLE=something
- --env-file /path/to/data.env - this works for both remote and local files in the dotenv file format
- --import-current-env - you SHOULD always use this with --ignore-env to exclude env vars you do NOT want passed to your workflow step
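A few illustrative invocations; the variable names are placeholders, and the exact argument format of --ignore-env is not shown in this README, so treat that part as an assumption:
# Pass a single variable
npx kog run -f /local/path/to/workflow.json --env GREETING=hello
# Pass a dotenv file (local or remote)
npx kog run -f /local/path/to/workflow.json --env-file /path/to/data.env
# Import the current environment, excluding variables you do not want forwarded
npx kog run -f /local/path/to/workflow.json --import-current-env --ignore-env SECRET_TOKEN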
On docker, the env variables are all passed in via --env and are thus inherently insecure; do NOT pass passwords, tokens, etc. this way unless you are 100% sure you are running on a local docker daemon and 100% sure the box is secure.
On kubernetes, all the env variables are saved as a kubernetes Secret and then loaded into the job via the normal kubernetes methods. This is only marginally more secure than the docker method, so still do not pass passwords or tokens this way.
Volume Mounts
The docker executor supports mounting volumes into each step action; the semantics are identical to the docker --volume flag.
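As a sketch, assuming kog exposes the mount through a --volume flag with the same host:container[:mode] syntax as docker (the flag name is an assumption, not confirmed by this README):
# Mount a host directory read-only into each step's container
npx kog run -f /local/path/to/workflow.json --volume /host/data:/data:ro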
Kubernetes does not yet support volume mounting. If you are interested in this feature, feel free to open an MR.
Logging
By default, Kog logs to the console in logfmt format. It is possible to switch the format to JSON by setting the --log-formatter flag to json.
The log settings are also propagated to all spawned containers via the passed context.
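For example:
# Emit JSON logs instead of logfmt; spawned containers inherit this setting
npx kog run -f /local/path/to/workflow.json --log-formatter json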
Remote Logging
To enable log aggregation, it is useful to set the --remote-logger flag to syslog. You can also specify the remote logger address with --remote-logger-addr 10.0.0.15:514, which is useful if you are passing your logs to Logstash or some other log aggregation tool.
Enabling the remote logger does not disable local logs; it adds a handler that ships logs to a remote location.
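Putting the two flags together:
# Keep local console logs and also ship them to a syslog/Logstash endpoint
npx kog run -f /local/path/to/workflow.json --remote-logger syslog --remote-logger-addr 10.0.0.15:514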
Status Notification
To enable step result notifications, set the --notifier flag to either console or a URL such as https://url.example.com/kog/notifications. This sends a notification on every step start so you can keep track of a job's progress. You can specify multiple notifiers for each run.
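For example (repeating the flag to register multiple notifiers is an assumption, and the URL is a placeholder):
# Print step notifications to the console and POST them to an HTTP endpoint
npx kog run -f /local/path/to/workflow.json --notifier console --notifier https://url.example.com/kog/notifications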
HTTP Notifiers
Passing a URL to the notifier flag enables kog to make an HTTP POST request to that URL with the JSON-serialized pwe.Job object as the body.
Return Codes
By default, the kog run command maps the job status to return codes; use --use-job-rc=false to always return 0 (except for issues with the kog command itself).
- 127 - error with the kog command itself
- Job Status Return Codes
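For example, in a CI script:
# Default: the exit code reflects the job status, so a failed job fails the pipeline
npx kog run -f /local/path/to/workflow.json
echo "kog exited with rc=$?"
# Always exit 0 regardless of job status (127 is still returned for errors with kog itself)
npx kog run -f /local/path/to/workflow.json --use-job-rc=false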