Using Gamification with SonarQube: the Story
Some time ago I finished a gamification MOOC on Coursera and, since I found the topic really interesting, I decided to put it into practice at my job. I set up a game in which the more Sonar issues you resolve, the more points you get; players rank on a leaderboard and earn badges.
In this post, I'd like to share the results of this experiment and how people were the key to its success, resolving many potential bugs and reducing our technical debt.
This game has evolved to become a product: Quboo. Check the official web page for more information.
Looking for the (business) goal
It started with a very simple idea: apply gamification to a task that is boring but that we have to do anyway. The task had to be boring or repetitive because I wanted to avoid the overjustification effect: applying gamification to something we should be intrinsically motivated to do, like creativity or innovation in the company, could backfire.
I also ruled out gamifying the day-to-day work of resolving JIRA issues, since that could have felt like a kind of exploitation: it's definitely not a good idea to rank employees by the number of tickets they resolve.
At first I thought we had no use case for gamification, but then I suddenly noticed one that met all these conditions: fixing potential bugs and bad code practices in our legacy code by resolving SonarQube violations.
Modeling the code quality problem with Sonar
SonarQube (formerly Sonar) is an open-source platform for managing code quality. It covers many languages, including Java, and can be extended with plugins that add new rule engines; we mainly use rules from FindBugs, PMD, and Checkstyle. It detects many bad code practices and also reports your unit test coverage and overall complexity.
In our company, we use Sonar as part of our Continuous Integration setup. A Jenkins job runs Maven with the Sonar plugin and publishes the results to the Sonar server. We have several quality gates configured, and if any important violation is introduced, an email is sent to the developers.
A big part of our code base is more than 10 years old and was written without much care for good practices, so this so-called legacy code is a big issue for us. We inherited many Sonar violations whose authors are no longer on the project.
The main idea was to encourage the team to fix the legacy issues, since they are the source of bugs every now and then and make the code harder to maintain. New issues have to be fixed on a daily basis anyway, and they can't be part of the game: otherwise someone could introduce issues on purpose and then fix them to win (it sounds weird, but it could happen). So I established the 'Date of the Legacy': any issue created before that date gives points to its assignee in Sonar once it is fixed.
To start with, the scoring system was based only on Blocker and Critical issues, of which we had many from the beginning. The worse the issue, the more points you get for solving it; to keep things simple, I based the score directly on the technical debt calculated by Sonar.
The last thing to consider was cleaning up duplicated and controversial rules in Sonar. This avoided big score swings and kept the game coherent from day one.
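As a rough sketch of that scoring idea (the class name, severity weights, and method here are my own illustration, not the game's actual code), a debt-based score might look like this:

```java
// Illustrative sketch: convert Sonar's technical debt estimate (in minutes)
// into game points, weighting Blocker issues above Critical ones.
// The weights are assumptions chosen for the example.
public class LegacyScore {

    static int points(String severity, int debtMinutes) {
        int weight = switch (severity) {
            case "BLOCKER" -> 3;
            case "CRITICAL" -> 2;
            default -> 0; // other severities are out of the game for now
        };
        return weight * debtMinutes;
    }

    public static void main(String[] args) {
        System.out.println(points("BLOCKER", 20));  // 60
        System.out.println(points("CRITICAL", 20)); // 40
        System.out.println(points("MAJOR", 20));    // 0
    }
}
```

Basing points on technical debt means the score needs no separate tuning: Sonar already estimates how much effort each fix is worth.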
It is a simple PBL (Points + Badges + Leaderboard) approach. Fixing Sonar issues gives you points to climb the leaderboard, and every now and then you earn badges, for example for fixing several issues of the same type or many in a row. It's such a simple game implementation that it obviously wouldn't have worked without further explanation, so I created a wiki page describing it and gave the team the main reason to play: having fun while achieving a business objective.
There is no winner or loser in the game, only a leaderboard with points and badges. Even though there is no date to proclaim a winner, we recently rewarded the team member with the highest score with a mug, with no prior notice. It's not much, but small surprises are good.
Required setup and the micro-web
Your CI framework has to be prepared to analyze your code base: Jenkins and Maven, with the corresponding plugins, have to send reports to the Sonar server. You'll also need a Sonar account that the micro-web game can use to access the Sonar API.
For the micro-web, I used Spring Boot, the SonarQube REST API, Thymeleaf, and Pure.css.
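The micro-web mainly reads data from SonarQube's Web API. As an illustration of the kind of query involved, here is a sketch that builds a search for resolved Blocker/Critical issues created before the legacy date. The host and date are placeholders, and you should check the `api/issues/search` parameter names against your SonarQube version's Web API documentation, as they have changed across releases:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

// Sketch: build a query URI for SonarQube's issues search endpoint.
// Parameter names follow the api/issues/search endpoint; verify them
// against the Web API docs of your Sonar version.
public class SonarQuery {

    static URI legacyIssuesQuery(String host, String legacyDate) {
        Map<String, String> params = new LinkedHashMap<>();
        params.put("severities", "BLOCKER,CRITICAL");
        params.put("statuses", "RESOLVED,CLOSED");
        params.put("createdBefore", legacyDate); // the 'Date of the Legacy'
        String query = params.entrySet().stream()
                .map(e -> e.getKey() + "="
                        + URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8))
                .collect(Collectors.joining("&"));
        return URI.create(host + "/api/issues/search?" + query);
    }

    public static void main(String[] args) {
        System.out.println(legacyIssuesQuery("http://sonar.example.com", "2017-03-27"));
    }
}
```

The micro-web would fetch this URI with the game's Sonar account, parse the JSON response, and feed the issues into the scoring logic.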
How users play the game
The premise is simple: while working, you come across code that is confusing or badly written and likely to end up causing a bug. Instead of ignoring it and simply making your changes, you can do better:
- Cover the code with a unit test if it isn't covered yet. If a corresponding 'Insufficient code coverage…' issue exists in Sonar, assign it to yourself: that will give you points.
- Once you're sure you're not introducing bugs, refactor the code to make it easier to maintain, less coupled, etc. Go to Sonar and assign yourself the existing violations you are going to fix. The next time Sonar runs, it will mark those issues as resolved/closed and you will score points in the game.
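The steps above boil down to a small tally: of the issues Sonar marks as resolved, only those created before the 'Date of the Legacy' count towards each assignee's score. A minimal sketch, where the record fields and names are assumptions for illustration rather than the actual API shape:

```java
import java.time.LocalDate;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Sketch: tally points per assignee from resolved issues, counting only
// issues created before the legacy cutoff date.
public class Leaderboard {

    record Issue(String assignee, LocalDate creationDate, int debtMinutes) {}

    static Map<String, Integer> tally(List<Issue> resolved, LocalDate legacyDate) {
        return resolved.stream()
                .filter(i -> i.creationDate().isBefore(legacyDate)) // legacy only
                .collect(Collectors.groupingBy(Issue::assignee,
                        Collectors.summingInt(Issue::debtMinutes)));
    }

    public static void main(String[] args) {
        List<Issue> fixed = List.of(
                new Issue("alice", LocalDate.of(2015, 1, 10), 30),
                new Issue("alice", LocalDate.of(2016, 6, 2), 15),
                new Issue("bob", LocalDate.of(2017, 5, 1), 60));
        // alice totals 45; bob's issue is new code, so it is out of the game
        System.out.println(tally(fixed, LocalDate.of(2017, 3, 27)));
    }
}
```

The date filter is what makes the rule about new issues enforceable: fixes to freshly introduced violations simply never enter the tally.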
The game’s introduction
To respect the team, it's important that everybody knows the game's objective. The idea is to increase motivation, not to manipulate people into doing something. It's an incentive for a typically boring task, and if you can have fun while doing it, that's perfect.
It was clear from the very beginning that no reward was associated with the game apart from the intrinsic motivation of competing with colleagues. Still, we recently gave a mug to the leader as a surprise, and he really appreciated it.
The first month
From the very beginning, some people got much more involved than others: roughly 5 out of 20 developers started contributing. In this period many critical violations were quickly fixed, since they were the ones that scored the most.
Then, after a month, I sent an email with a summary and a snapshot of the leaderboard, and some others started to contribute. We are four teams, so I introduced a 'Team Ranking', which proved powerful for getting more people involved: by that time we had reached 10 of the 20 developers. Even some business analysts started adding comments to the code to fix related Sonar issues, which is also a great contribution.
Eventually, the game was working!
The results after three months
Within three months many issues were solved, and overall motivation for solving them increased noticeably.
Here are some stats taken from Sonar. The game was introduced on March 27th. As you can see, there is a clear decrease in all types of issues; during the first month all the blockers were removed, and a big part of the critical issues were fixed as well.
Moreover, the company sets quality goals for its different groups, and those soon showed that we clearly stood out among the other teams!
People are the key
One thing is clear: the key to the success of this experiment has been the people. It's a really collaborative team, open to any new idea, that understood perfectly what the objective of the game was and had fun with it.
I want to thank them publicly for the collaboration!