noths-liken
v0.5.7

liken
generate and compare screen-shots
Liken is a set of regression testing tools. Together, these tools allow you to quickly add visual regression testing (by generating and comparing screen-shots) to any project.
Contents
Workflow
Using the following workflow, Liken speeds up development by reducing the need for manual testing:
The Review Server
This server allows you to quickly see the changes and helps you save them if they were desirable.
Delta Comparison Example
Overlay Comparison Example
Setup
Install Liken + Dependencies
- install Cairo (a native graphics library)
brew update
brew install pkg-config
brew install cairo
npm i -D yargs opn
npm i -D noths-liken nightwatch selenium-standalone sauce-connect-launcher
- Nightwatch can run end-to-end tests on any environment (e.g. locally or on pre-prod) and on any browser
- selenium-standalone is also needed to connect the local browser with Nightwatch.
- sauce-connect-launcher is required if using SauceLabs (BrowserStack capability is baked into Nightwatch)
Add Environment Variables
We highly recommend using SauceLabs or BrowserStack to do the testing. This will ensure all testing is completed on preset machines.
If you choose to test against your local dev environment, be aware that CI and other dev environments are likely to differ.
BrowserStack
Preferred due to better mobile device reliability
BROWSERSTACK_ACCESS_KEY
BROWSERSTACK_USERNAME
SauceLabs
SAUCE_ACCESS_KEY
SAUCE_USERNAME
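As a sketch, the SauceLabs credentials might be exported in your shell profile or CI secret settings (the values below are placeholders, not real credentials):

```shell
# Placeholder values — substitute your own SauceLabs credentials.
export SAUCE_USERNAME="my-sauce-user"
export SAUCE_ACCESS_KEY="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
```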
Create a Regression Tests Folder
Now copy everything from /example/ (you can just run mv node_modules/noths-liken/example ./) to your project's tests directory, then read on to learn what each file does. Some files will most likely need to be customised for your project's needs.
- nightwatch.conf.js
  - The Before function ensures that our web server and the SauceLabs connection are started before the tests begin.
  - The After function takes care to close all open servers (to prevent the system from hanging) once the tests have completed.
- nightwatch.json
  - Sets up the Selenium connection to the browsers we wish to test with.
- nightwatchGlobals.js
  - Updates the config to allow SauceLabs to connect.
  - Custom Nightwatch commands.
- regression-reviewer.js
  - A UI to quickly accept or reject any screen-shots which have been saved.
- start_regression.js
  - A Node.js script to programmatically run any task before Nightwatch, e.g. starting your app server.
- footer.e2e.js
  - An example end-to-end test, using Nightwatch, that takes a screen-shot in mobile and desktop views.
  - Update this file to suit your page!
Add NPM Scripts
The script-tasks below allow the recommended workflow to be followed.
{
"scripts": {
"preregression:core": "selenium-standalone install --version=3.0.1",
"regression": "npm run regression:core -- --env chrome_win --skiptags mobile",
"regression:desktop": "npm run regression:core -- --env firefox_win,chrome_win,ie11,edge,safari_osx --skiptags fixture,mobile",
"regression:core": "NODE_ENV=production ./tests/start_regression.js --saucelabs=true -o ./tests/tests_output -c ./tests/config/nightwatch.conf.js --saveDir=new --compareDir=base",
"regression:review": "node ./tests/regression-reviewer.js"
}
}
preregression:core: Ensures Selenium is installed correctly before testing begins.
regression: Run your tests locally against a single desktop browser.
regression:desktop: Run your tests against all important desktop browsers.
regression:mobile: Run your tests against all important mobile browsers.
regression:core: The core task used to feed the other tasks (not to be run by itself).
regression:review: The UI server used to help review the test results quickly.

By default the local dev server is http://localhost:8080. You can change this in nightwatch.conf.js.
What Next?
Add Regression Tests
browser.compareScreenshots({ selector, state, done })
Just like footer.e2e.js, you will need to ensure you have some Nightwatch tests that compare sections of your page.
The compareScreenshots command will take a screen-shot, trim and save the image, then compare this new image to any previously saved base image. If they are the same, the test will pass.
Options

selector: [String] A CSS selector (e.g. .my-component) which the screen-shot will be trimmed to match.
state: [String] Any short descriptor which will be used when saving the image.
done: [Function] A callback, if needed.
Run Regression Tests
npm run regression
Once the code has been updated, you'll be keen to find out whether anything has broken.
If the target option is omitted or points to localhost, a tunnel to the testing service will automatically be created.
Because tunnels can become unstable with large builds, we recommend you test against a single browser locally.
See the Continuous Integration section for setting up a reliable suite to test multiple devices.
Options

--target: For the comparison screen-shots, this should be your local/development URL.
--saveDir: This should be a new directory, ignored by Git, where the comparison screen-shots will be stored.
--compareDir: This should match the directory where your base screen-shots are saved.
--env: This should match a key under the test_settings object within nightwatch.json. It should point to SauceLabs or BrowserStack to reduce inconsistencies between machines.
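For instance, a one-off run with explicit options might look like this (the flag values below are illustrative only; adjust the env, target and directories to match your project):

```shell
# Illustrative values only — adjust env, target and directories for your project.
npm run regression:core -- \
  --env chrome_win \
  --target=http://localhost:8080 \
  --saveDir=new \
  --compareDir=base
```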
Review Test Results
npm run regression:review
This server has been included to help you quickly spot any differences.
Once loaded you'll be able to see the base screen-shots on the left, with the comparison screen-shots on the right.
In the middle is another image which highlights any changes between the two.
If the new screen-shot is incorrect - you've got some work to do!
If the change is correct, click Accept new Screen-shot.
Continuous Integration
Once the tests have passed locally, we recommend that the code is pushed so that CI can continue testing on more devices.
To do this reliably, it is best not to rely on a tunnel being created (as we did locally) between the test machine and the testing service.
If you can get a new app to spin up with the new code, we can use that new URL to test against.
Example CircleCI config:
machine:
timezone:
Europe/London
node:
version: v6.2
test:
override:
- npm run test:unit -- --reporter mocha-junit-reporter:
environment:
MOCHA_FILE: $CIRCLE_TEST_REPORTS/junit/test-results.xml
- npm run lint
- "[[ ! -s \"$(git rev-parse --git-dir)/shallow\" ]] || git fetch --unshallow"
- git push [email protected]:your-app-preprod.git $CIRCLE_SHA1:refs/heads/master -f --no-verify
- npm run regression
- npm run regression:desktop -- --target=http://your-app-preprod.herokuapp.com
general:
artifacts:
- tests/regression/screenshots
Recommendations
Keep the screenshots small. Full-page screenshots will fail with every new feature, causing frustration and leading to failing tests being ignored. Cutting the page up into components will ensure that failing tests are meaningful.