The pitfalls of performance measurement
We all thought we were doing a great job…how could this happen?
Measuring performance is pretty much a part of the workplace psyche these days, but do we take it for granted? Do we really understand why it’s important?
The DTA’s Performance Dashboard is all about measurement, but it is important that the right things are measured at the right time: that product teams identify early in the process which metrics will show them how well they have met their users’ needs.
This goes to the heart of why the DTA has developed the Performance Dashboard. It’s a central platform for all levels of government to share information and to facilitate improved transparency within agencies and to the general public.
It’s an opportunity for agencies to demonstrate how well they are meeting users’ needs and to look for opportunities to continually improve performance. It also provides a visual platform for people to monitor performance, so that mistakes like the bridges that don’t meet in the middle don’t happen. But this raises the question: how can problems like that eventuate?
The team behind the Dashboard have spent some time considering the pitfalls you should avoid if you want performance measurement to be effective.
The image of the bridge that didn’t meet in the middle is an engineer’s worst nightmare. The scary thing is that it can happen. Clearly if it were to occur you could surmise that something went horribly wrong. Performance against a reasonably obvious objective, ‘make sure the two ends of the bridge meet’, either wasn’t measured or wasn’t measured properly.
The same thing can happen in your workplace. Imagine the following scenario.
There’s a problem with your service. Your team needs to fix the problem, and it needs to happen yesterday. No time to waste, the team springs into action and fixes it. You all think it’s great. Well done team, we’re awesome! Except it doesn’t make the user experience better; it might actually have made it worse.
What went wrong? A few basic things…
The team didn’t take the time to understand the problem in terms of the user need. They didn’t set clear goals about what they wanted to achieve. And, they didn’t develop any measures they could use to demonstrate whether they were achieving their goals.
There is already too much to do, and setting up performance measures early can be sent to the bottom of a long to-do list, too often never to get done. Not only is this approach risky, it means the intrinsic value of knowing how well (or not) you are travelling is completely lost.
So, on the basis that we agree that performance measurement is a valuable tool for improvement, why are teams unsuccessful in their attempts to utilise it?
There are some common pitfalls that need to be front and centre in your thinking when trying to identify performance metrics. Here’s our top 5 list (in no particular order) of things that need to be considered if performance measurement is going to be successful.
1. Be clear about the purpose of the performance metrics. This will help foster a common understanding and avoid data being interpreted as something it is not. This should extend to what the consequence of poor results against the metrics might be. Is it an opportunity to improve or a reason to pivot?
2. Make sure the metrics reflect the real goals you are trying to achieve. Remember that they are supposed to be KEY Performance Indicators, not superfluous vanity metrics.
3. The metrics need to be objectively measurable. It’s no good having the world’s greatest metrics if you haven’t built mechanisms to capture them. As much as possible, avoid metrics that require some kind of subjective calculation.
4. Your performance metrics should be a fundamental tool for improvement, not an added extra. If you don’t develop performance metrics early, you miss out on the biggest benefits they can offer.
5. Keep testing to make sure you are building the right things. It’s easy to be lulled into a false sense of security knowing that you have metrics and you are measuring them. Stay user-centred and keep iterating. Don’t be the person responsible for the two bridges that don’t meet!
James is the Engagement Lead for the Performance Dashboard at the Digital Transformation Agency.