Here are five amazing recommendations from Avinash Kaushik, taken from a post about how to make Web analytics dashboards better by simplifying them.
1. Dashboards are not reports. Don't data puke. Include insights. Include recommendations for actions. Include business impact.
2. NEVER leave data interpretation to the executives (let them opine on your recommendations for actions with benefit of their wisdom and awareness of business strategy).
3. When it comes to key performance indicators, segments and your recommendations, make sure you cover the end-to-end acquisition, behavior and outcomes.
4. Context is everything. Great dashboards leverage targets, benchmarks and competitive intelligence to deliver context. (You'll see that in the above examples.)
5. This will be controversial but let me say it anyway. The primary purpose of a dashboard is not to inform, and it is not to educate. The primary purpose is to drive action!
It's a long post but well worth reading. I also like these sentences:
Somewhere along the way we've lost our way. Dashboards are no longer thoughtfully processed analysis of data relevant to business goals with an included summary of recommended actions. They are data pukes. And data pukes are not dashboards. They are data pukes.
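The recommendation about context is the one I see ignored most often, so here is a minimal sketch of the idea in Python (my own illustration, not from Avinash's post; every metric name, number and suggested action below is invented):

# A minimal sketch: each dashboard line carries its own context -- a target
# and a prior-period benchmark -- plus a suggested action, instead of a bare
# number. All figures and the suggested actions are hypothetical.

def dashboard_row(name, value, target, benchmark):
    """Return one annotated dashboard line for a metric."""
    vs_target = (value - target) / target
    vs_benchmark = (value - benchmark) / benchmark
    status = "on track" if value >= target else "below target"
    action = ("no action needed" if status == "on track"
              else "drill down by traffic segment to locate the shortfall")
    return (f"{name}: {value:.2%} ({status}; {vs_target:+.1%} vs target, "
            f"{vs_benchmark:+.1%} vs last quarter) -- recommended action: {action}")

# Hypothetical numbers for a conversion-rate KPI
print(dashboard_row("Conversion rate", value=0.021, target=0.025, benchmark=0.019))

The code itself is beside the point; what matters is the shape of each line -- a number, its context, and something to do about it.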
I'm not sure if I agree with this - "NEVER leave data interpretation to the executives".
An executive's title is just that - they interpret the current state of the business, and *execute* actions based on how they think the state can be improved. If all a COO sees is carefully curated indicators (which the VPs and Directors have all massaged to ensure that their departments come out looking good), the COO doesn't really have much of a job to do. At a large manufacturing firm, the COO demanded to see detailed reports of weekly order & production numbers at every facility, with no statistical manipulation, smoothing, or exclusion of "outliers". He understood the business deeply and knew what he was looking at, so he would know where to focus his attention, and his plant managers couldn't hide behind a dashboard.
I guess what I'm saying is - a dashboard improves his life if and only if the metrics are identified and developed in cooperation with him and him alone. Too often, political realities in organizations give the job of designing the dashboard metrics to people with a conflict of interest...
Posted by: Nate | 09/16/2014 at 08:55 AM
I wonder about dashboards. I see them in books, and in software demos, but I've never had one that was useful and don't know of any executive who really uses one -- even in a company that writes software for them.
Are dashboards the pie charts of the 2010s?
Nate has a good comment, but I'm in the middle. The analyst should describe what the data means, based on their deep knowledge of that data. Data do not speak for themselves. The executive, based on a shallow but wider knowledge of the overall company situation, may or may not agree.
Posted by: ZBicyclist | 09/18/2014 at 11:56 PM
Nate: Good point. I suspect you're interpreting Avinash's point more expansively than he intended. I think he means the dashboard should be interpreted data, rather than just data. But since you brought the issue up, it is frequently the case that dashboards are used by managers to "manage" communications up to the level above. One challenge of setting up an analytics team is how to create conditions under which the analysts can provide a neutral interpretation of the numbers. A further challenge is that many CEOs have their own agendas too.
ZBicyclist: I think you are right. The more quantitative executives prefer spreadsheets or data tables -- though I wish they would use dashboards (i.e. graphs) instead. The typical use of dashboards that I have encountered is for diagnosis... at the aggregate level, one can't fully explain any trend, so you end up ordering further analyses based on hypotheses about what might be happening.
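To make the diagnosis point concrete, here is a made-up sketch in Python (the segment names and numbers are invented): the dashboard only tells you that the aggregate moved; the follow-up analysis is a drill-down by segment to form hypotheses about where the movement came from.

# Made-up sketch of the diagnosis workflow: the dashboard flags an aggregate
# move, and the follow-up analysis drills down by segment to see where the
# change came from. All numbers are invented.

last_week = {"paid search": 4200, "organic": 6100, "email": 1500, "direct": 2300}
this_week = {"paid search": 3100, "organic": 6000, "email": 1550, "direct": 2250}

total_change = sum(this_week.values()) - sum(last_week.values())
print(f"Aggregate visits changed by {total_change:+d}")  # what the dashboard shows

# Drill-down: which segments account for the change?
for segment in last_week:
    delta = this_week[segment] - last_week[segment]
    share = delta / total_change
    print(f"  {segment:12s} {delta:+5d}  ({share:.0%} of the total change)")

In this invented example the drop sits almost entirely in one segment, which gives you a hypothesis to investigate -- not yet an explanation.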
Posted by: junkcharts | 09/19/2014 at 03:40 AM
Maybe I misunderstood Avinash's point, but I would find it hard (if not impossible, and certainly very time-consuming) to provide "recommendations" to CXOs based on the analysis done on the dashboard. Recommendations require pretty good cause-and-effect thinking, and cause-and-effect reasoning requires a mixture of subject-matter expertise and good data (and analytics). Often analysts are not positioned to have a sufficient level of expertise on the data (this is why we have managers). And the level of detail in a "strategic" dashboard is rarely (if ever!) fine enough to demonstrate cause and effect, or even correlation.
Although I like the idea of "insights" and "recommendations", it borders on tampering without sufficient expertise/data. Especially with BIG metrics like revenue. Why did revenue go down last month in this Fortune 50 business? That's a HUGE question to answer with just a strategic dashboard. I thought the dashboard was supposed to point out where/when revenue drops, and then the action is to investigate.
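To illustrate with a made-up sketch in Python (invented figures and an arbitrary threshold): at the strategic level the dashboard can flag that revenue dropped and by how much, but the only honest "recommendation" at that altitude is to investigate.

# Made-up illustration: the strategic dashboard flags a revenue drop;
# the recommendation it can honestly support is "investigate".
monthly_revenue = [128.4, 131.0, 126.5, 118.2]  # $ millions, invented figures

for prev, curr in zip(monthly_revenue, monthly_revenue[1:]):
    change = (curr - prev) / prev
    flag = "investigate" if change < -0.03 else "ok"  # 3% drop threshold, arbitrary
    print(f"{prev:.1f} -> {curr:.1f}  ({change:+.1%})  {flag}")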
Posted by: Jordan | 09/22/2014 at 02:57 PM