The passage of time is a fact of life that we can neither control nor even influence. No matter how hard we try, minutes, days, weeks and months pass by with remarkable regularity and uniformity.
Yet, despite this universal fact, time is the single ever-present dimension in any business report: how has this metric performed over the last six months? Is this other metric up or down compared to last year?
This is all very well, but so what? Whatever has changed in the last month is almost certainly not due to the passage of time itself, but to the things that happened during that time.
Now, I am not saying that we shouldn’t track key metrics over time, but, and this is important, we should only do so with a mindset of measuring the effect of all the influencing factors on that metric.
Using time-based metrics to prompt questions is a limiting exercise, because it can only really prompt one question: “why has <metric> gone up/down?” That is a question asked too late; at the risk of stating the obvious, whatever has happened has already happened.
The real question any actionable analytics should be asking is: has a metric performed as expected? In other words, irrespective of the time period, has what was meant to happen, happened?
Take the scenario: ‘Complaints were up 5% year-on-year this month for product X’. At first glance a sensible enough statement, and obviously not a positive statistic. But what if I told you that, on average, complaints were up 40% across the whole range? How does it look now? What if I told you that this product is part of a new quality testing regime that started six months ago with the goal of reducing complaints by 10%? Is this result positive or negative now?
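The scenario above can be worked through numerically. This is a minimal sketch using the figures quoted (the variable names and the specific comparisons are illustrative, not a prescribed method):

```python
# Figures from the scenario: product X complaints are up 5% year-on-year,
# the category average is up 40%, and the quality regime targeted a 10%
# reduction in complaints.
product_yoy = 0.05       # +5% year-on-year for product X
category_yoy = 0.40      # +40% on average across the whole range
target_change = -0.10    # quality regime aimed for a 10% reduction

# Relative to the category trend, product X is doing well:
# (1 + 0.05) / (1 + 0.40) - 1 = -0.25, i.e. roughly 25% fewer complaints
# than the category-wide trend would predict.
relative_to_category = (1 + product_yoy) / (1 + category_yoy) - 1

# Relative to the stated target, it still falls short:
# 0.05 - (-0.10) = 0.15, i.e. 15 percentage points above the goal.
gap_to_target = product_yoy - target_change

print(f"vs category: {relative_to_category:+.0%}")  # vs category: -25%
print(f"vs target:   {gap_to_target:+.0%}")         # vs target:   +15%
```

The same raw number (+5%) reads as a success against one baseline and a failure against another, which is exactly why the baseline must be chosen deliberately.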
A few simple changes can transform limited-use timelines into actionable analytics:
Set a baseline: what are you measuring against and have you adjusted for other factors?
Change your view of time: is the continual passage of time the important thing to measure, or rather, is it the performance of your metric since a particular event? In other words, is January important, or is it day six from the launch?
And, consider abandoning time altogether: measure your metrics (or change in metrics) against each other over a fixed period of time to understand cause and effect.
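The second change above, measuring from an event rather than from the calendar, amounts to re-indexing a metric by days since the event. A minimal sketch (the dates, figures, and the `launch` event are hypothetical):

```python
from datetime import date

# Hypothetical daily metric keyed by calendar date; the event (here, a
# product launch) is what we actually want to measure against.
daily_sales = {
    date(2019, 1, 4): 120,
    date(2019, 1, 5): 150,
    date(2019, 1, 6): 180,
}
launch = date(2019, 1, 4)

# Re-index by "days since launch" instead of calendar date, so that day
# six from the launch matters, not whether it happened to fall in January.
since_launch = {(d - launch).days: v for d, v in daily_sales.items()}

print(since_launch)  # {0: 120, 1: 150, 2: 180}
```

Once metrics are indexed this way, launches (or promotions, or quality initiatives) in different months become directly comparable on the same axis.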
My point here is that the passage of time, in and of itself, is a constant and is not an indication that anything has changed or moved.
If you want to get real insight from your reporting, you need context and, above all, an understanding of why you’re measuring the metric in the first place.
Earlier this year, Marks & Spencer announced plans to create “the world’s first Data Academy in retail” to bolster their ongoing transformation, which is seeking to place digital at the heart of its business. The move by M&S is a progressive one and wise in a world where data is often referred to as the new oil.
Here at S4RB we pride ourselves on our consultancy-led approach to our software solutions and, rather than riding the ‘Business Intelligence’ tool bandwagon, we instead deliver dashboards.
In the retail industry, there are still a lot of people burying themselves in charts and analysis, which is why we work with customers to define the right dashboards that deliver real insights against chosen KPIs. And that is where changing the perception of time can drive real insight.