Chrome, cURL and Docker to the rescue!

I was recently faced with the task of doing a major library upgrade for the software I’m working on. Our system is cleanly separated into a RESTful backend API talking JSON and an AngularJS frontend. My primary concern was the backend API, because that is the part most affected by the library changes. A first glance showed that much code would break because of the major version jumps (for example Hibernate 3.6.10 to 5.x, Spring 3.0.5 to 4.2.x). How am I going to make sure nothing breaks? Integration tests to the rescue! If the bar stays green, the code must be clean… Unfortunately, the real integration tests were turned off in the past, for whatever reason. But now… we’re stuck.

So, programming the tests for a full integration check would take weeks, if not months. However, our tester has a clearly defined set of test scenarios he walks through when testing for a release. I decided to investigate whether we could record the API calls during those scenarios in some way and ‘replay’ them, so that when the system changes and some test conditions need to change, we don’t have to ‘re-code’ them all but merely ‘copy’ the changed API calls from the browser.

We ended up with the following solution: we executed the test scenarios one after the other by hand, using Chrome’s Developer Tools to capture all XHR requests to our backend. The Developer Tools have the nice ‘Copy as cURL’ feature, and we pasted those lines into logically grouped text files. For example, the cURL command for my blog (this site) is

curl 'http://www.jointeffort.nl/blog/' -H 'Pragma: no-cache' -H 'Accept-Encoding: gzip, deflate, sdch' -H 'Accept-Language: en-US,en;q=0.8,nl;q=0.6' -H 'Upgrade-Insecure-Requests: 1' -H 'User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.73 Safari/537.36' -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8' -H 'Cache-Control: no-cache' -H 'Cookie: _ga=GA1.2.1343258608.1398536077; __utma=240562076.1343258608.1398536077.1449228725.1449266147.10; __utmc=240562076; __utmz=240562076.1448044332.4.3.utmcsr=jointeffort.nl|utmccn=(referral)|utmcmd=referral|utmcct=/' -H 'Connection: keep-alive' --compressed

We simply pasted all those lines, grouped by use case and test scenario.

Next, I wrote a tool to parse those lines one by one and execute them against our test server (more on that later). I added features like

  • --runAfter someFile.js, which allows us to write JavaScript to validate the responses and extract information such as IDs from them for later use
  • --pause 5, to pause the execution of the tests, in order to give our full-text search engine time to pick up the changes
  • --data-binary @somefile.json, which allows us to keep readable JSON documents in separate files
  • interpolation of variables in URLs and posted JSON data, like ${someId}, which gets replaced with values of variables extracted from responses by the --runAfter scripts (see the sketch right after this list)
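
For instance, the ${…} interpolation is little more than a regex replace over the URL and request body before each call. A minimal sketch in Java (the class name is hypothetical, not the actual tool code):

import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class Interpolator {

    private static final Pattern VAR = Pattern.compile("\\$\\{(\\w+)\\}");

    // Replaces every ${name} in the input with the value captured earlier
    // by a --runAfter script; unknown variables are left untouched.
    public static String interpolate(String input, Map<String, String> variables) {
        Matcher m = VAR.matcher(input);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String value = variables.get(m.group(1));
            m.appendReplacement(sb, Matcher.quoteReplacement(value != null ? value : m.group(0)));
        }
        m.appendTail(sb);
        return sb.toString();
    }
}

Calling interpolate("http://testserver/api/orders/${orderId}", variables) with a previously captured orderId then yields the concrete URL to execute (the URL is made up, of course).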

For executing the JavaScript validation rules I used the javax.script.ScriptEngine API.
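
A minimal sketch of how such a validation script can be evaluated, assuming the JavaScript engine that ships with the JDK (Rhino on Java 7, Nashorn on Java 8); the binding names response and variables are my own, for illustration:

import javax.script.Bindings;
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import java.io.FileReader;
import java.util.HashMap;
import java.util.Map;

public class RunAfterExecutor {

    public static void main(String[] args) throws Exception {
        ScriptEngine engine = new ScriptEngineManager().getEngineByName("JavaScript");

        // Expose the last HTTP response and a shared variable map to the script.
        Map<String, String> variables = new HashMap<>();
        Bindings bindings = engine.createBindings();
        bindings.put("response", "{\"id\": 42, \"status\": \"CREATED\"}");
        bindings.put("variables", variables);

        // someFile.js could then contain, for example:
        //   var json = JSON.parse(response);
        //   if (json.status != 'CREATED') throw 'unexpected status: ' + json.status;
        //   variables.put('someId', String(json.id));
        engine.eval(new FileReader("someFile.js"), bindings);

        System.out.println(variables); // prints {someId=42}
    }
}

In this sketch, a script that throws fails the run, and anything it puts into the variable map becomes available for ${…} interpolation in later calls.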

The next challenge was to be able to run those tests repeatably against a known test set. I ended up with the following setup (a sketch follows below the list): using Gradle, I

  • export the latest structure of our database (tables only)
  • export data from some reference tables only
  • build the latest version of our application
  • create several connected Docker containers simulating our production environment
  • for each test set:
    • launch fresh containers
    • execute the test scripts
    • drop the containers
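
To give an idea of the per-test-set part, a Gradle build script fragment could look like the following (container names, the image, and the tool’s main class are made up for illustration; the real scripts are more involved):

apply plugin: 'java'

// Hypothetical fragment: start a fresh database container, replay one
// scenario file with the tool, and always drop the container afterwards.
task startContainers(type: Exec) {
    commandLine 'docker', 'run', '-d', '--name', 'test-db', 'mysql:5.6'
}

task runScenario(type: JavaExec, dependsOn: startContainers) {
    main = 'nl.jointeffort.replay.Main' // made-up main class of the replay tool
    classpath = sourceSets.main.runtimeClasspath
    args 'scenarios/use-case-1.txt'
}

task dropContainers(type: Exec) {
    commandLine 'docker', 'rm', '-f', 'test-db'
}

runScenario.finalizedBy dropContainers

The finalizedBy wiring makes sure the containers are dropped even when a scenario fails, so every test set starts from a clean, known state.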

The only task left now is recording the scenarios, but that seems to be no more than ‘Copy as cURL’…
