I am rethinking my “categories of metrics” for content strategy and management after attending Joe Gelb’s and Lawrence Orin’s webinar “How to Become a Data-Driven Documentation Team” earlier this month.
(Please review my January 2020 blog, “The 5 Intersects of Content Strategy and Project Management [Part 5].”)
In summary, Gelb and Orin from Zoomin recommend that content metrics be carefully aligned with our organizations’ OKRs (objectives and key results) or KPIs (key performance indicators). Accordingly, we should look for metrics that are actionable, that is, data that assists us in measuring our objectives.
Note: Many thanks to the Content Wrangler for hosting the Feb. 12 webinar with Gelb and Orin.
Actionable vs Scorecard Data
The Zoomin folks led us through an exercise to classify various website metrics as either scorecard or actionable data.
A scorecard metric differs from actionable data in that it only tells us where we stand on a scale or over a period of time. In other words, it is trend data. An example is the number of visitors to a web page.
Actionable metrics, in contrast, spur us to action and often suggest what that action could be. An example, from my last blog and also from the Zoomin folks, is the "search terms with no results" metric.
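To make the idea concrete, here is a minimal sketch of how a "search terms with no results" report might be derived from site-search logs. The log format and the query strings are assumptions for illustration, not any particular analytics product's schema.

```python
from collections import Counter

# Hypothetical search-log records: (query, number_of_results).
# The tuple layout and the sample queries are assumptions for illustration.
search_log = [
    ("install cli", 12),
    ("upgrade firmware", 0),
    ("reset password", 7),
    ("upgrade firmware", 0),
    ("licence renewal", 0),
]

# Count how often each query returned zero results.
no_result_counts = Counter(q for q, hits in search_log if hits == 0)

# The most frequent zero-result queries suggest concrete actions:
# add missing content, add synonyms, or fix search indexing.
for query, count in no_result_counts.most_common():
    print(f"{query}: {count} failed searches")
```

The output ranks the failed queries by frequency, which is exactly what makes this metric actionable: each entry points at a specific content or search-tuning fix.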
Of course, our selection of a category for a metric depended on the objective we were attempting to measure. So for some, keeping a certain metric trending above a designated score could be a KPI, as could maintaining an upward trend (for example, a percentage improvement) over a designated time period.
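Those two KPI framings can be sketched in a few lines. The weekly visit counts and the threshold below are invented numbers, purely to illustrate the two checks.

```python
# Hypothetical weekly page-visit counts; the numbers are illustrative.
weekly_visits = [1200, 1260, 1310, 1405]

THRESHOLD = 1000  # the "designated score" from the hypothetical KPI

# KPI framing 1: every observation stays above the designated score.
above_threshold = all(v > THRESHOLD for v in weekly_visits)

# KPI framing 2: the metric maintains an upward trend
# (each week improves on the previous one).
upward_trend = all(later > earlier
                   for earlier, later in zip(weekly_visits, weekly_visits[1:]))

print(above_threshold, upward_trend)
```

The point is that the same underlying trend data becomes a KPI only once an objective defines what "good" looks like, whether that is a floor or a direction.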
Categories of Metrics Revisited
So here is my re-categorization of the original metric groups – from Adobe Analytics – that I described last month.
In the "Actionable" category, I have refined some of the metric groups as comparisons, most often against what the expectation would be. That expectation could be defined in the objective and could be revised based on discovery.
For example, if the objective was to archive and halt work on any PDFs that hadn't been downloaded in three months, we might have to qualify that based on the expectation that certain PDFs would be downloaded only quarterly. An example might be the CLI (command-line interface) reference manual, to which updates are made quarterly based on maintenance of the interface.
Basically, these “vs expectation” metrics are actionable when they fall below or exceed expectations.
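A "vs expectation" check like the PDF-archiving example above might look like the following sketch. The titles, dates, and per-document expected intervals are all invented for illustration; the idea is simply that each document carries its own expectation rather than a one-size-fits-all cutoff.

```python
from datetime import date

# Hypothetical records: title -> (last download date, expected interval in days).
# A quarterly manual gets a longer expectation than a general-purpose guide.
pdf_downloads = {
    "Getting Started Guide": (date(2020, 2, 1), 90),
    "CLI Reference Manual": (date(2019, 12, 15), 120),  # quarterly cadence, with slack
    "Legacy Install Notes": (date(2019, 9, 1), 90),
}

def archive_candidates(records, today):
    """Return titles whose last download is older than their expected interval."""
    return [
        title
        for title, (last_dl, expected_days) in records.items()
        if (today - last_dl).days > expected_days
    ]

print(archive_candidates(pdf_downloads, today=date(2020, 3, 1)))
```

Only the documents that fall outside their own expectation are flagged, which is what turns a plain download count into an actionable metric.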
Notice that the “Activity maps” metric falls into both categories. Depending on how we interpret the metric, it could be either scorecard or actionable data. If we want only a record of activity for a given time period, the metric offers us scorecard data. If we do a deep dive into the metric to examine the data for the link targets and any similar pages, we could formulate an action plan for either the source page, the target pages, or both.
In fact, a deep dive into any of the metrics listed here could potentially drive action. If we ask the question “Why?” often enough as we are digging down into the data, some kind of action or need for further research likely will reveal itself. The most helpful feature of the Adobe Analytics workspace is that it lets us stack and compare metrics to reveal more insights.
Many thanks to Gelb and Orin for helping me think more critically about the metrics that I have used. I invite anyone who has experience with actionable content data to share that experience by commenting on this page or on social media.