Ever notice how dashboards never tell the whole story? Don’t you want to know what’s causing that spike in the chart?
Example:
A testing dashboard shows the number of test cases by status (passed/failed/not executed/etc.). It may even show the number of test cases by status per module/functionality for the application under test (AUT). That's all well and good, but I would wonder how many defects are tied to the test cases that didn't have a status of passed. Wouldn't you?
A wise man once told me that the devil is in the details. He was right (at least I think so). If you can't see the details, how do you know what needs to be made top priority? How do you know that a defect originally marked as Minor should really be marked as Critical because it's blocking a high number of test cases?
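To make that concrete, here is a minimal Python sketch. It assumes a hypothetical export of defect-to-test-case links from whatever test management tool you use; the IDs, field layout, and threshold are illustrative only, not a prescription:

```python
from collections import Counter

# Hypothetical defect-to-test-case links exported from a test management tool;
# the IDs and shape of the data are illustrative, not from any specific tool.
blocked_links = [
    ("DEF-101", "TC-001"), ("DEF-101", "TC-002"),
    ("DEF-101", "TC-003"), ("DEF-102", "TC-010"),
]

# Count how many test cases each defect blocks.
blocked_counts = Counter(defect_id for defect_id, _ in blocked_links)

# A defect filed as Minor may deserve a second look once it blocks more test
# cases than some agreed threshold (3 here, chosen purely for the example).
ESCALATION_THRESHOLD = 3
for defect_id, count in blocked_counts.items():
    if count >= ESCALATION_THRESHOLD:
        print(f"{defect_id} blocks {count} test cases - review its severity")
```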
Before you create/publish your next dashboard/metric, sit back and ask yourself this: does it really show the details needed to give the full picture?
A dashboard should let the user see not only a rollup view of the AUT but also the details behind the numbers. Charts such as test cases by status per module, or defects ranked by the number of test cases they block, start to provide that bigger picture.
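One way to get both views from the same data, sketched in Python against a hypothetical flattened export from a test management tool (field names are assumptions): it rolls test cases up by module and status, then drills into the defects behind anything that didn't pass.

```python
from collections import Counter, defaultdict

# Hypothetical flattened export: one record per test case with its module,
# latest status, and linked defect IDs. Field names are assumptions.
test_cases = [
    {"id": "TC-001", "module": "Login",    "status": "Passed",       "defects": []},
    {"id": "TC-002", "module": "Login",    "status": "Failed",       "defects": ["DEF-101"]},
    {"id": "TC-003", "module": "Checkout", "status": "Blocked",      "defects": ["DEF-101"]},
    {"id": "TC-004", "module": "Checkout", "status": "Not Executed", "defects": []},
]

# Rollup: test case counts by module and status (the chart most dashboards stop at).
rollup = Counter((tc["module"], tc["status"]) for tc in test_cases)
for (module, status), count in sorted(rollup.items()):
    print(f"{module:<10} {status:<14} {count}")

# Drill-down: the defects sitting behind every test case that is not passed.
defects_behind = defaultdict(list)
for tc in test_cases:
    if tc["status"] != "Passed":
        for defect_id in tc["defects"]:
            defects_behind[defect_id].append(tc["id"])

print("\nDefects behind non-passed test cases:")
for defect_id, tc_ids in defects_behind.items():
    print(f"{defect_id}: {len(tc_ids)} test case(s) -> {', '.join(tc_ids)}")
```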
Do your dashboards/metrics tell you the whole story?