It’s not perfect, but it is a start. Unlike business reporting, which tends to report on the things that show improvement, the challenge here is building a framework that shows how interactions directly benefit revenue.
UX performance measurement is slightly different: with dashboards you can monitor signals that show things are heading in the right direction, and if they fall off a cliff you will know before anyone else. These measures should be aligned to business objectives, dials and strategies.
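A minimal sketch of what a "fell off a cliff" check behind such a dashboard might look like. The function name, window size and threshold are all illustrative assumptions, not a prescribed implementation:

```python
# Illustrative sketch: flag when a tracked UX metric suddenly drops
# well below its recent baseline. Names and thresholds are assumptions.

def cliff_alert(daily_values, window=7, drop_threshold=0.3):
    """Return True if the latest value has dropped more than
    drop_threshold (here 30%) below the average of the preceding window."""
    if len(daily_values) < window + 1:
        return False  # not enough history to judge
    baseline = sum(daily_values[-window - 1:-1]) / window
    latest = daily_values[-1]
    return baseline > 0 and (baseline - latest) / baseline > drop_threshold

# Example: a task-completion rate that suddenly halves
history = [0.62, 0.60, 0.63, 0.61, 0.64, 0.62, 0.63, 0.30]
print(cliff_alert(history))  # True: a 50%+ drop against the weekly baseline
```

In practice this would be wired to whatever metric the business dial maps to, so the alert reaches you before anyone else is looking.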
It should also show how your experiences are tracking against some of these crucial numbers. If you make a change somewhere, how can you be sure that you can attribute that change to the larger, more meaningful numbers that everyone outside of your immediate team is looking at? Sometimes this raises the question of whether there has been any change at all, a bit like reverse engineering a graph of ‘actual sales’ where the y axis does not start at zero…
If the user behaviour and interactions you are measuring have no effect on the numbers that matter to the business, then surely the team is focused on the wrong kind of work, which ultimately does not build a case for ongoing CX or UX maturity.
That doesn’t mean the measures have to stay the same all of the time; over time, pick others that are better aligned to the dials or shifts you are trying to target. But at least, as a start, you are measuring something.
The flexibility is that these measures can then support hypothesis-led design, whether in an agile framework or in waterfall, to show the impact that a design-led team is having on the bottom line.
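Hypothesis-led design ultimately means testing whether a change moved a number. One common way to do that is a two-proportion z-test on conversion before and after a change; the sketch below is a generic illustration, and the sample sizes and conversion counts are invented for the example:

```python
# Illustrative sketch: did a design change shift conversion?
# Two-proportion z-test; all figures below are assumed example data.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothesis: the redesigned journey lifts conversion from 4.0% to 4.8%.
z = two_proportion_z(conv_a=400, n_a=10_000, conv_b=480, n_b=10_000)
print(round(z, 2))  # 2.76 — beyond 1.96, significant at the 5% level
```

Framing design changes as hypotheses like this is what lets the team attribute movement in the bigger numbers to its own work, rather than to noise.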
It’s not perfect, but how else do we know what impact we are having, and where in the journey, in a way that also aligns to the analysis of user testing? And how else do we demonstrate that the design is fit for purpose over time?
