@apica-io/asm-auto-deploy
v4.6.0
Auto deployment of checks and associated scenarios into Apica ASM
- ASM Auto Deploy tool
- Required roles in Apica ASM
- The deployment descriptor
- Deployment descriptor JSON Schema
- Integrations
- Known bugs in this release
- The run scenario tool
ASM Auto Deploy tool
Introduction
ASM Auto Deploy (asm-auto-deploy) helps you automatically deploy assets into the Apica Synthetic Monitoring (ASM) platform. It uses the ASM REST API, so the deployment requires no manual interaction. The tool is driven by what we call a deployment descriptor, which manages an automatic deployment.
You can run asm-auto-deploy as a command line tool downloaded with npm or as a task in Azure DevOps. See https://marketplace.visualstudio.com/items?itemName=apica.asm-auto-deploy.
When you use it as a command line tool, it can be integrated with your own deployment processes and tools. It is also good practice to test the deployment descriptor with the command line tool before you integrate with Azure DevOps or Jenkins.
Command line interface
asm-auto-deploy -h
Usage: index [options]
Options:
-V, --version output the version number
-f, --file <file> The deployment descriptor file
-w, --workDir <workDir> Working directory
-s, --sourceDir <sourceDir> Source directory
-rep, --repository <repository> Source repository
-t, --authTicket <authTicket> Authentication ticket. Override value in descriptor
-zh, --zebraHome <zebraHome> Zebra Tester home directory
-l, --logLevel <logLevel> Log level in log4js. Use DEBUG for more output, default is INFO
-r, --run run checks
-at, --apiTrace Trace API calls
-b, --bail Stop on first error
-u, --update Update deployment descriptor
-am, --message ASM Message after deployment
-m, --modules <modules> The modules to execute [all,checks,scenarios] . Default all. Comma separated list
(default: "checks,scenarios,run-only")
-h, --help display help for command
Reports created by asm-auto-deploy
Two reports are created by asm-auto-deploy. The reports are stored in the working directory. The name $deployment.name comes from the name property in the deployment descriptor.
| Report | File name | Comments
| -------| ------------|----
| JSON | $deployment.name.json | A detailed report in JSON format.
| Markdown | $deployment.name.md | An overview in Markdown format. This is the report that is published when used in a CI/CD pipeline.
| Logs | *.log | Log files are stored in the logs subdirectory.
Git repository support
You can use Git repositories as a source. When you use a repository as the source, the deployment descriptor should be stored in the repository, and changes to the deployment descriptor will be committed to it. The source directory will be the directory for the clone of the repository. The cloned directory is not deleted after a run; it can be good practice to delete it manually or with a script.
Repository Credentials
Credentials can be specified in two ways:
- In the repository URL, using the syntax https://user:token@<git host>/foo/bar
- Using the environment variable GIT_CREDENTIALS, with the format username:token. Using the environment variable is recommended.
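For example, the environment-variable option can be set up like this in the shell that launches the tool (the user name and token below are placeholders, not real credentials):

```shell
# Provide Git credentials via the environment rather than embedding them in
# the repository URL. The token value here is a placeholder.
export GIT_CREDENTIALS="ci-user:example-token"

# asm-auto-deploy reads the variable at run time; verify it is set:
echo "$GIT_CREDENTIALS"
```

Keeping the token out of the repository URL also keeps it out of log output and shell history entries that record the URL.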
Required roles in Apica ASM
To run asm-auto-deploy, the user associated with the authTicket needs the following access rights on the target ASM account.
Monitor Group access right
Access rights to all top-level monitor groups assigned to checks included in the *deployment descriptor*. This applies when the deployment includes checks.
Required ASM Roles
The following roles are required for running the tool.
| Role | Optional | Comments
| ---| ----|----
| Check Admin | No | Required for all requests
| Alert Admin | Yes | Required for assigning alerts to checks
| Customer Admin | Yes | Required for assigning tags to checks
The deployment descriptor
Overview
The deployment descriptor is a JSON document containing meta-data about all assets that will be deployed to ASM. The asm-auto-deploy tool uses the ASM REST API, and the schema used in the deployment descriptor is based on the API: all properties of the API calls for creating and updating checks and scenarios are supported. Use the API Documentation as a reference when creating deployment descriptors.
Concept with properties
The deployment descriptor has a hierarchical structure, and properties are always inherited from the parent level when they have the same name. Properties on a lower level override the parent value.
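The inheritance rule can be sketched as a simple merge, child over parent. This is a conceptual illustration only, not the tool's actual implementation; the property values are hypothetical, shaped like the descriptor examples later in this README.

```python
def effective_config(parent: dict, child: dict) -> dict:
    """Merge child properties over the properties inherited from the parent level."""
    merged = dict(parent)  # start with everything inherited from the parent level
    merged.update(child)   # same-named properties on the lower level win
    return merged

# Hypothetical property values:
group_level = {"locationMatch": "Frankfurt", "run": "all"}
check_level = {"locationMatch": "Dublin"}  # overrides the group value

print(effective_config(group_level, check_level))
# {'locationMatch': 'Dublin', 'run': 'all'}
```

The check keeps the group's `run` value but overrides `locationMatch`, which is the behavior described above.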
Complete API support
This version uses the latest version of the ASM REST API and supports all properties for creating and updating checks:
- All advanced options for browser checks
- All advanced options for zebra tester checks
See the API Documentation for all properties per check type.
Deployment descriptor JSON Schema
https://files-apicasystem-com.s3-eu-west-1.amazonaws.com/schemas/asm-auto-deploy-schema-v1-0.json
Root section
| Property | Description
| ---| ----
| name | Name of the deployment descriptor
| description | A description of the deployment descriptor
| asmApiUrl | The base URL for your ASM rest API. Should end with the version of the api
| asmUIUrl | The URL to your ASM portal
| zebraScenarios | A JSON object containing meta-data for Zebra Tester scenarios to deploy
| browserScenarios | An array of names containing the browser scenarios to deploy
| checks | A JSON object containing the meta-data about the checks to deploy
Zebra Scenarios section (name=zebraScenarios)
| Property | Description
| ---| ----
|compileOptions | Compile options for all zebra tester scenarios. See API documentation for information about compile options
| groups | An array of groups. A group has similar compile options
Groups
| Property | Description
| ---| ----
| compileOptions | Additional compile options for the group. Will be merged with the general compile options
| directory | An optional property for specifying the directory in the source where scenario sources are found
| scenarios | An array of zebra tester scenarios to deploy
| scenarios[...].name | Name of the scenario without extension
| scenarios[...].type | Type of scenario. zip or prxdat
| scenarios[...].plugins | Array of plugins used by the scenario. Each plugin should be a Java source file
| scenarios[...].files | Data files (csv) used by the scenario
| scenarios[...].subDirectory | An optional subdirectory name, if the scenario is stored in a subdirectory
Zebra Scenario source options for zip files
There are two options for building a Zebra Tester archive in zip file format.
- The source directory can contain a valid Zebra Tester archive in an existing zip file. It must be in the format that ASM accepts: the zip file must contain one valid prxdat file plus additional resources such as plugins, data files, and so on. This option is required for zip files containing external jar files.
- The source directory contains a prxdat file and the additional resources. The additional resources are plugin Java source files and data files. To compile plugins, the Zebra Tester home directory must be set on the command line with the zebraHome option or via the environment variable ZEBRA_HOME. The zip file will be built by compiling all plugins with the JDK found under the Zebra Tester home directory.
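As a rough illustration of what the second option produces, the archive simply bundles the prxdat file with its resources, paths kept relative. This sketch skips the plugin-compilation step the real tool performs with the JDK under ZEBRA_HOME; the file names are taken from the example descriptor later in this README.

```python
import pathlib
import tempfile
import zipfile

def build_archive(src_dir: pathlib.Path, zip_path: pathlib.Path) -> None:
    """Bundle every file under src_dir into a zip archive with relative paths."""
    with zipfile.ZipFile(zip_path, "w") as zf:
        for f in sorted(src_dir.rglob("*")):
            if f.is_file():
                zf.write(f, f.relative_to(src_dir))

# Hypothetical scenario source directory:
work = pathlib.Path(tempfile.mkdtemp())
src = work / "TM_Event_v1"
src.mkdir()
(src / "TM_Event_v1.prxdat").write_text("...")    # the recorded scenario
(src / "events.csv").write_text("event_id\n1\n")  # data file used by the scenario

build_archive(src, work / "TM_Event_v1.zip")
with zipfile.ZipFile(work / "TM_Event_v1.zip") as zf:
    print(sorted(zf.namelist()))
# ['TM_Event_v1.prxdat', 'events.csv']
```

In the real workflow, the plugin `.java` sources would also be compiled and included before upload.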
Browser Scenarios section (name=browserScenarios)
An array of browser scenarios to deploy. Each item contains the file name of the scenario. HTML and Side formats are supported. Side-format scenarios often need modifications in ASM before they can be used as checks. This version also supports an extended format for browser scenarios, with a JSON object as each item of the array:
"browserScenarios": [
{
"name": "RobotShop_search_deploy.html",
"browserName": "chromium",
"locationMatch": "Frankfurt",
"minVersion": 74,
"url": "http://ec2-52-57-166-83.eu-central-1.compute.amazonaws.com:8080",
"run": true
},
{
"name": "RobotShop_session_deploy.html",
"browserName": "chromium",
"locationMatch": "Frankfurt",
"minVersion": 74,
"url": "http://ec2-52-57-166-83.eu-central-1.compute.amazonaws.com:8080",
"run": false
},
{
"name": "google_test.html",
"browserName": "chromium",
"locationMatch": "Dublin",
"minVersion": 74,
"url": "https://www.google.com",
"run": true
}
],
If you use the JSON object format, you can test browser scenarios directly in ASM without creating a check.
Checks section (name=checks)
The checks section is divided into an array of groups. Each group has common meta-data for the group.
| Property | Description
| ---| ----
| name | A descriptive name of the group
| disabled | An optional boolean flag for disabling the group
| run | A parameter controlling whether the checks in the group should be run. Values: all, scheduled, or none.
| alerts | An array of alerts associated with the checks in the group
| alerts[...].type | The alert type (see API documentation)
| alerts[...].name | The name of the alert
| alerts[...].severity | The severities that trigger the alert, for example IEWF
| tags | See example below for the format
| locationMatch | Matching string to find the location.
| browserChecks | JSON objects describing the full browser checks in the group
| commandChecks | JSON objects describing the command checks in the group
| urlChecks | JSON objects describing the URL V2 checks in the group
| zebraChecks | JSON objects describing the Zebra Tester checks in the group
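The escaped brackets in the locationMatch value used in the full example below ("Stockholm, \\[amazon\\]") suggest the match string is applied as a regular expression against location names. A conceptual Python illustration, with made-up location names (ASM's real location list and matching rules may differ):

```python
import re

# Made-up location names for illustration only.
locations = [
    "Stockholm, [amazon]",
    "Frankfurt, [azure]",
    "Dublin, [amazon]",
]

# Brackets are escaped because [ and ] are regex metacharacters.
location_match = r"Stockholm, \[amazon\]"
matched = [loc for loc in locations if re.search(location_match, loc)]
print(matched)
# ['Stockholm, [amazon]']
```

A plain substring such as "Frankfurt" would also work as a pattern, which matches the simpler locationMatch values shown elsewhere in this README.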
Generic properties for all check types
Most of the generic properties are merged with the same properties on individual checks. You can override them at check level.
| Property | Description
| ---| ----
| name | A descriptive name of the group
| disabled | An optional boolean flag for disabling the group
| run | A parameter controlling whether the checks in the group should be run. Values: all, scheduled, or none
| alerts | An array of alerts associated with the checks in the group
| alerts[...].type | The alert type (see API documentation)
| alerts[...].name | The name of the alert
| alerts[...].severity | The severities that trigger the alert, for example IEWF
| tags | See example below for the format
| locationMatch | Matching string to find the location
| config | Check configuration properties. A combination of generic and check type specific properties. See API documentation
| id | The check id. This property is updated automatically when the check is found. It should not be entered manually
Command check specific properties
Command checks have an arguments property whose contents are specific to each individual check type. You must execute the API call checks/command-v2/categories to get information about all parameters per check type. See the API Documentation for the specific call.
| Property | Description
| ---| ---
| arguments | An array of value pairs with name and value properties
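For example, the Port command check shown in the full descriptor example below passes its target host and port as such name/value pairs:

```json
"arguments": [
  { "name": "Host", "value": "ticketmonsterdev.apicasystem.com" },
  { "name": "Port", "value": "80" }
]
```

The valid argument names for each command check type come from the checks/command-v2/categories API call mentioned above.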
Browser check specific properties
| Property | Description
| ---| ---
| minVersion | The minimum version of the browser
| maxVersion | The maximum version of the browser. This property is optional (not used)
Schema location
https://files-apicasystem-com.s3-eu-west-1.amazonaws.com/schemas/asm-auto-deploy-schema-v1-0.json
Deployment descriptor example
{
"$schema": "https://files-apicasystem-com.s3-eu-west-1.amazonaws.com/schemas/asm-auto-deploy-schema-v1-0.json",
"name": "Dev_Deployment",
"description": "A dev deployment",
"asmApiUrl": "https://api-wpm2.apicasystem.com/v3",
"asmUIUrl": "https://wpm.apicasystem.com",
"authTicket": "EC0210F5-71E1-4BAD-A8DD-XXXXXX",
"zebraScenarios": {
"compileOptions": {
"ContentTest": "heuristic",
"PrxVersion": "V7.0-B",
"NeedContentFilter": true
},
"groups": [
{
"name": "Group1",
"disabled": false,
"compileOptions": {
"NeedErrorFilter": true
},
"directory":"zebraScenarios",
"scenarios": [
{
"name": "TM_Event_v1",
"type": "zip",
"plugins": [
"WebPerformanceValue.java"
],
"files": [
"events.csv"
],
"subDirectory": "TM_Event_v1"
}
]
},
{
"name": "Default",
"disabled": false,
"compileOptions": {
"PageThinkTime": 1,
"PageThinkTimeVariance": 10,
"ExternalXmlAndSoapRequestSize": 0,
"NeedErrorFilter": true
},
"scenarios": [
{
"name": "sp_new",
"type": "prxdat"
},
{
"name": "TM_BookingSession_v2",
"type": "zip",
"plugins": [
"AppDynamicsIntegrationV5.java"
]
}
]
}
]
},
"browserScenarios": [
"Google.side"
],
"checks": [
{
"name": "TM Development",
"disabled": false,
"run": "all",
"monitorGroups": [
"Auto Deployment Tool/Development"
],
"alerts": [
{
"type": "webhook",
"name": "ven03142",
"severity": "IEWF"
}
],
"tags": {
"Environment": [
"Dev"
],
"Application": [
"TicketMonster"
]
},
"locationMatch": "Frankfurt",
"zebraChecks": {
"config": {
"interval_seconds": 7200,
"max_attempts": 1,
"threshold_w": 15000,
"threshold_w_set_0": false,
"threshold_e": 25000,
"threshold_e_set_0": false
},
"checks": [
{
"config": {
"name": "TM Count Bookings - Dev",
"description": "Count current number of bookings in development environment.",
"scenario_filename": "tm_countbookings_v3.prxdat",
"additional_options": "-hostname ticketmonsterdev.apicasystem.com",
"threshold_lo_w": 300,
"threshold_lo_e": 5,
"threshold_w_set_0": true,
"threshold_e_set_0": true
},
"id": 57785
},
{
"config": {
"name": "TM Booking Session - AppD Integration -Dev",
"description": "TM Booking session with AppDynamics Integration plugin",
"scenario_filename": "TM_BookingSession_v2.zip",
"additional_options": " -DELETE_PERCENT 10 -HTTP_PROTOCOL http -PORT 80 -EVENT_ID random -HOST ticketmonsterdev.apicasystem.com -THINK_TIME 1 -vTriggerSnapshot true -vUrlMatcher .*"
},
"tags": {
"APM Integration": [
"AppDynamics"
]
},
"id": 57786
}
]
},
"urlChecks": {
"config": {
"request_method": "GET",
"threshold_w": 1000,
"threshold_e": 3000,
"interval_seconds": 3600,
"max_attempts": 1,
"custom_hdrs": {
"Auto-deployed": "true"
}
},
"locationMatch": "Ireland, Dublin",
"tags": {
"Logical Check Type": [
"Up-Time"
]
},
"checks": [
{
"config": {
"url": "http://ticketmonsterdev.apicasystem.com",
"request_method": "GET",
"name": "Wildfly JBoss - Dev",
"description": "App server admin url development.",
"content_pattern": "<title>Welcome to WildFly 9</title>",
"content_pattern_type": "literal",
"content_pattern_case_sensitive": true
},
"id": 57784
}
]
},
"commandChecks": {
"config": {
"threshold_w": 1000,
"threshold_e": 2000,
"interval_seconds": 7200
},
"locationMatch": "Stockholm, \\[amazon\\]",
"tags": {
"Logical Check Type": [
"Up-Time"
]
},
"checks": [
{
"config": {
"name": "Port TicketMonster Dev - Auto",
"description": "Ticket Monster port development."
},
"arguments": [
{
"name": "Host",
"value": "ticketmonsterdev.apicasystem.com"
},
{
"name": "Port",
"value": "80"
}
],
"category": "Diagnostic",
"name": "Port",
"id": 57777
}
]
},
"browserChecks": {
"config": {
"max_attempts": 1,
"threshold_w": 4000,
"threshold_e": 8000,
"interval_seconds": 7200,
"url": "http://ec2-52-57-166-83.eu-central-1.compute.amazonaws.com:8080"
},
"locationMatch": "Germany, Frankfurt",
"browserName": "chromium",
"minVersion": 74,
"tags": {
"Logical Check Type": [
"Front-End"
],
"Application": [
"Robotshop"
]
},
"checks": [
{
"config": {
"name": "RobotShop - Search Autodeploy",
"description": "Robot shop search product"
},
"scenario": "RobotShop_search_deploy",
"id": 57781
}
]
}
}
]
}
Integrations
Jenkins Integration
You can run asm-auto-deploy integrated with Jenkins. We provide a template Jenkinsfile, which needs some customization to work in your environment.
Installation
- Install asm-auto-deploy on the Jenkins agent.
- Copy the JenkinsFile to the Git source repository.
- Create the pipeline in Jenkins.
JenkinsFile in the source Git repository
The file is named JenkinsFile here, but you can use another name.
import groovy.json.JsonSlurperClassic
import java.nio.file.Paths
def reportName = ''
pipeline {
agent any
parameters {
string(name: 'deploymentDescriptor', defaultValue: 'smallDeployment.json', description: 'The deployment descriptor file')
string(name: 'zebraHome', defaultValue: '/opt/zebratester/embedded/zebratester', description: 'Zebra Home directory')
string(name: 'workDir', defaultValue: '/home/jenkins/work', description: 'Work directory')
choice(name: 'logLevel', choices: ['debug', 'info', 'error'], description: 'Log level')
choice(name: 'modules', choices: ['all', 'checks', 'scenarios'], description: 'Deployment modules to run. All = default')
// booleanParam(name: 'run', defaultValue: true, description: 'Run checks')
string(name: 'lowLimitDeploy', defaultValue: '80', description: 'Low limit for % successful deployments')
string(name: 'lowLimitRun', defaultValue: '80', description: 'Low limit for % successful runs')
}
stages {
stage ('Check deployment descriptor') {
steps {
script {
def workspace="${WORKSPACE}"
def fileName= Paths.get( workspace,params.deploymentDescriptor).toString()
echo "Deployment Descriptor file = ${fileName}"
def json = readFile(file:fileName)
def data = new JsonSlurperClassic().parseText(json)
reportName = data.name
}
}
}
stage('Run ASM Auto Deploy') {
steps {
sh "asm-auto-deploy -f ${params.deploymentDescriptor} -s ${WORKSPACE} -w ${params.workDir} -l ${params.logLevel} -zh ${params.zebraHome} -m ${params.modules} -u -r"
}
}
stage('Get Check Results') {
steps {
script {
def returnValue
def lowLimitDeploy=params.lowLimitDeploy.toInteger();
def lowLimitRun=params.lowLimitRun.toInteger();
def reportFile= Paths.get( params.workDir,'rep_'+reportName+'.json').toString()
def json = readFile(file: reportFile)
def data = new JsonSlurperClassic().parseText(json)
echo "Percent of successful deployments=${data.successPercent} %. Low limit=${lowLimitDeploy}"
echo "Percent of successful check runs =${data.runSuccessPercent} %. Low limit=${lowLimitRun}"
echo "Number of deployed items =${data.deployedItems}"
echo "Number check runs =${data.checkRuns}"
if( data.successPercent < lowLimitDeploy || data.runSuccessPercent < lowLimitRun ) {
catchError(buildResult: 'FAILURE', stageResult: 'FAILURE') {
sh "exit 1"
}
}
}
}
}
}
}
Running the pipeline
The pipeline you create will not run without problems until you change the parameters and approve some Groovy classes.
See the Jenkins documentation about Groovy script approvals.
Known bugs in this release
- Command checks: the Target SLA is not set.
- Command checks: parameter values for Security category checks cannot be updated.
These known bugs are related to API constraints. The API will be updated to address these issues.
The run scenario tool
The asm-run-scenario command line tool can run a browser scenario from the command line. It enables you to upload and test a scenario without creating a check.
Command line options
Usage: asm-run-scenario [options] [scenarios...]
Options:
-V, --version output the version number
-t --authTicket <authTicket> ASM Api Ticket
-au --apiUrl <url> ASM Api Url
-bn --browserName <name> Browser name
-bv --browserVersion <version> Browser version
-u --url <url> url
-lm --locationMatch <location> Location match
-l, --logLevel <logLevel> Log level in log4js
-at, --apiTrace Trace API calls
-mt, --maxTries <tries> Max tries for job completion (default: "25")
-it, --idleTime <seconds> IdleTime between tries (s) (default: "12")
-h, --help display help for command
Example
asm-run-scenario -au https://api-wpm2.apicasystem.com/v3/ -t ********** -bn chromium -bv 74 -lm Frankfurt -u http://ec2-52-57-166-83.eu-central-1.compute.amazonaws.com:8080 RobotShop_search_deploy
[2021-09-29T10:02:41.679] [INFO] runScenario - Start running scenario [RobotShop_search_deploy] job
[2021-09-29T10:02:42.218] [INFO] runScenario - Waiting for [RobotShop_search_deploy] job to complete...
Browser scenario [RobotShop_search_deploy] job complete url=http://ec2-52-57-166-83.eu-central-1.compute.amazonaws.com:8080. Completed=true
{
in_process: false,
is_error: false,
error_step_id: null,
error_step_name: null,
error_text: null
}
If a specified scenario is a path to a downloaded scenario file, the file will be uploaded to ASM and then executed. Example: source/RobotShop_search_deploy.html