DADI’s web services have been in development for over four years. It all started with DADI API, the RESTful backbone of the API-first, Create Once Publish Everywhere approach that was the founding thought behind the platform.
We realized very early on in the development process that DADI API would be playing a big part in many of our internal and external projects. To that end we had to ensure that any software we used in-house or published to NPM was well-tested and easy to maintain.
This post describes some of the ways we achieve this across our (now significantly larger!) development team.
🔗Coding style
During the first iteration of API development, when the team was super small, we had no real need to implement a coding style for the project. With only two developers, one generally followed the style of the other, and the style of the codebase stayed largely consistent. As more developers began to contribute to the functionality of API, it became obvious we needed to make some decisions about style.
We settled on StandardJS, a zero-configuration linter that enforces a sensible set of rules (for example, always handling the err argument in a Node.js callback).
With no editor configuration files required, it’s incredibly easy to implement across a distributed team.
In the beginning I think we all had issues with a style being imposed on us, as every engineer has their own personal style which they’ve developed over many years of coding. Not having to type a semicolon at the end of a statement was the hardest part for most of us!
Using the StandardJS linter in our editing software - and adding a pre-commit step to double-check code before it is committed to the repository - has helped us develop faster and with fewer bugs. Being able to open any file in the platform and read the code without having to adjust to a previous developer’s style is an enormous timesaver.
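As a rough sketch of how this fits together, a package.json can expose StandardJS as a lint script and wire it into a pre-commit hook. The hook tool and version numbers here are illustrative (husky is one common choice; the post doesn’t name the tool we use):

```json
{
  "scripts": {
    "lint": "standard",
    "precommit": "npm run lint"
  },
  "devDependencies": {
    "standard": "^10.0.0",
    "husky": "^0.13.0"
  }
}
```

With this in place, a commit containing code that violates the StandardJS rules fails before it ever reaches the repository.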
🔗Unit and acceptance testing
Automated testing of the code written for the platform has been integral to our development process. The first developers on DADI API added a test suite and a testing process right at the beginning of its development. That test suite has now grown to over 560 individual unit and acceptance tests and the process has been replicated across each of DADI’s web services.
It’s all very well having a set of unit and acceptance tests, but how do you know whether you’re testing the right things? To obtain statistics on how much of our code is covered by tests, we use Istanbul.
Istanbul produces a coverage report at the end of every test run which tells us the overall percentage of code that is covered by tests and even goes so far as highlighting - in a lovely shade of red - code that is not touched during a test run. We display the overall coverage percentage in a badge on the individual web service repositories - and whilst we’re not at 100% coverage, we’re working to improve the coverage every week.
In some cases, if the coverage percentage decreases when the tests run, we consider the build a failure. This helps to remind developers that any new or modified code must be covered by at least one test.
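Istanbul ships with a check-coverage command that fails the process when coverage falls below a threshold. A strict no-decrease rule needs the previous run’s figure to compare against, but a fixed threshold gets most of the way there. The thresholds below are illustrative, not our actual numbers:

```json
{
  "scripts": {
    "test": "istanbul cover _mocha -- --recursive test && istanbul check-coverage --statements 90 --branches 75"
  }
}
```

Because check-coverage exits non-zero when a threshold is missed, any CI system running npm test will mark the build as failed.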
In addition to running the test suite on a local development machine, DADI’s web services are tested thoroughly by Travis CI. For every pull request submitted to a repository, Travis builds the application, runs the test suite and reports on the success of the build. With Slack integration and tools such as CC Menu, everyone on the team knows how a particular change has affected the codebase.
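A minimal .travis.yml for a setup like this might look as follows - the Node versions are illustrative, and the Slack notification block is omitted because it contains an account-specific token:

```yaml
language: node_js
node_js:
  - "6"
  - "8"
script:
  - npm run lint
  - npm test
```

Travis runs the script steps for every pull request, so lint failures and test failures both surface directly on the PR.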
🔗Contributing to the codebase
The DADI engineering team has a number of standards for committing to the codebase, including branch naming conventions and commit message guidelines. Having conventions for these two common tasks makes code and pull-request reviews easier and allows us to automate some of the processes.
Commit messages follow the same guidelines set by the Angular project. Prefixing commit messages with the type of change makes our git history more readable and for some web services, in conjunction with Semantic Release, allows for new versions to be published to NPM with automatically generated release notes.
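Under the Angular convention, each commit message starts with a type (such as feat, fix or docs) and an optional scope, followed by a short subject. The messages below are hypothetical examples in that format, not real commits from our history:

```
feat(collections): add support for compound indexes
fix(auth): return 401 when the bearer token has expired
docs(readme): correct installation instructions
```

Tools like Semantic Release parse these prefixes to decide whether a release is a patch, minor or major version, and to group the release notes by change type.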
🔗Keeping an eye on dependencies
In any project, ensuring that dependencies are stable and up to date can be a daunting task. At DADI we utilize the services of Snyk and Greenkeeper to help us manage updates to dependencies.
Snyk continually monitors our products for issues, checking each dependency against a vast database of vulnerabilities. For every vulnerability discovered, Snyk suggests actions that can be taken to protect against it, even going so far as submitting a pull request to the repository.
Greenkeeper provides a service that informs the development team of new versions of dependencies. New versions of our dependencies are often published to NPM in response to vulnerabilities discovered by Snyk in their dependencies - Greenkeeper makes managing this often-painful process a breeze, and like Snyk it will submit pull requests to the appropriate repository for review.
🔗It’s all about confidence
Whilst we haven’t always got it right the first time - there have been a couple of patch releases to NPM mere minutes after a minor version release, for example - the above development processes have evolved over time to where they are today. They are unobtrusive, reliable and an absolute necessity to give the engineering team and the many users of our products confidence that the software works as expected.
For more detailed information about any of these processes in our development lifecycle, don’t hesitate to get in touch!