Conducting a comprehensive content audit, also known as an information survey, can be a daunting task, especially if your organization has a large inventory of content. But is it really needed for every situation?
You would conduct a full content audit if your organization was undergoing a significant transformation, such as a digital transformation, or as part of a continuous improvement effort. But sometimes the moment calls for something more targeted and less exhaustive – something that enables you to quickly develop a plan of action.
I refer to this kind of content audit as a project-based content audit or a combined content audit and plan.
I developed this approach during my recent corporate gig as a content strategist, when I had to modify the original content strategy I had developed for our internal product portal. I was awash in new JIRA epics and stories because the product was evolving to accommodate new user requirements and a new environment. And that evolution was occurring within an Agile development framework. So I needed a way to quickly translate these rather feature-focused inputs into revisions for our user-focused content.
Content Audit Defined
According to Content Science, a content audit is an inventory of the content on a website or part of a website. It is, in other words, a catalog of your user-facing content – including videos, photos, and infographics.
Your goal in performing a full content audit is to identify content that needs improvement or transition. As Alina Petrova describes, “Regular, well-conducted content audits can help you identify content that needs to be improved, or can be repurposed into other formats.”
Thus a full content audit can include an evaluation of the metrics of each piece of content, using tools like Screaming Frog, Google Analytics, and/or BuzzSumo. If you have a sophisticated CCMS, the analysis might also include an examination of the reuse and versioning of a particular piece of content. Finally, your audit might include some kind of scorecard that evaluates the quality of the content, as suggested by the folks at Content Strategy, Inc.
Combined Content Audit and Plan
A project-level content audit, or combined content audit and plan, shares the goal of a full content audit and includes some elements of its analysis. But it’s not as comprehensive. The purpose of your effort is not only to gain perspective but also to organize a plan. So you set boundaries for the audit part and focus the analysis part on what is really needed to deliver useful content to the target audience within the confines of the project.
A combined content audit and plan accomplishes four objectives:
- Identifies existing content for the user tasks associated with the current sprint or project
- Determines an action for that content
- Identifies new content to fill gaps
- Prioritizes the content work to accommodate the project schedule
Analysis occurs with each of the last three objectives and requires you to match your understanding of the project needs with your understanding of the users’ needs. The effort must also consider the best formats and channels, including video and graphics, for delivering the high-priority content in a timely manner.
Inputs and Analysis
To start your analysis, examine the specifications, user stories, and other project artifacts created by the project management team and the product development leaders for a particular area of the product’s evolution. Even in our Agile world, these artifacts can be feature-centric or even developer-centric rather than user-centric. So start there.
Most of your energy likely will be spent interpreting these artifacts and determining how much of the product evolution really impacts the user:
- Which users are impacted?
- How have their tasks changed?
- What new tasks do the users now have to perform?
Two additional inputs help in this analysis. Look at any new user research that’s been gathered as well as your team’s set of user artifacts – user personas and user journeys. Have a clear picture of each user’s interaction with a particular area of the product’s evolution. List the general tasks for each area as a memory check if you have to. Make each area of the evolution a focus area for your content plan.
Note that your knowledge of the product and your relationships with the product development and project management teams can go a long way toward making your analysis easier.
Next, conduct a set of searches of your content repository or database to create your list of existing content for each focus area. Use your keyword list or taxonomy to ensure that you are gathering all of the relevant existing content.
In my corporate gig, I worked with our information architect to conduct keyword searches of our CCMS for each of my focus areas. Each keyword search would yield a set of content/topic GUIDs with their titles and the portal tab on which each appeared. For example, for the focus area of automated product updates, I ran keyword searches on version, firmware, software, and update (among others).
For each focus area, I had to consolidate the content lists to eliminate duplicates – such as when two keyword searches yielded the same GUID or when content was reused on different tabs. I also did an ad hoc inventory of relevant content in mixed topic types, such as troubleshooting topics and training videos. I then added that inventory to each of my focus area lists.
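The consolidation step above can be sketched in code. This is a minimal, hypothetical illustration – it assumes each keyword search returns rows of (GUID, title, portal tab), which mirrors the CCMS output described here but is not the author’s actual tooling. The same GUID can surface under several keywords or on several tabs, so the sketch merges everything into one de-duplicated inventory per focus area:

```python
# Hypothetical sketch: merge per-keyword search results for one focus
# area into a single de-duplicated content inventory. Each search
# returns (guid, title, tab) rows; the same GUID can appear under
# multiple keywords or on multiple portal tabs (reused content).

def consolidate(search_results):
    """search_results: dict mapping keyword -> list of (guid, title, tab).

    Returns a sorted list of unique (guid, title, tabs) entries, where
    tabs collects every portal tab on which the content appears.
    """
    merged = {}
    for keyword, rows in search_results.items():
        for guid, title, tab in rows:
            # First keyword to find a GUID supplies its title; later
            # hits only add any additional tabs where it is reused.
            entry = merged.setdefault(guid, {"title": title, "tabs": set()})
            entry["tabs"].add(tab)
    return [
        (guid, info["title"], sorted(info["tabs"]))
        for guid, info in sorted(merged.items())
    ]

# Illustrative data for the automated-product-updates focus area.
results = {
    "firmware": [("G-101", "Update firmware", "Maintain")],
    "update":   [("G-101", "Update firmware", "Maintain"),
                 ("G-102", "Automatic update settings", "Configure")],
}
print(consolidate(results))
```

The same GUID appearing under both "firmware" and "update" collapses into one entry, which is exactly the duplicate-elimination step described above.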
To start my combined content audit and plan, I turned each of these revised lists into a spreadsheet – one sheet per focus area. I recommend spreadsheets because they allow the list to be sorted in any way that suits the content developer – by tab, by title, by GUID, and so on.
Into each spreadsheet, I then added columns to help with the plan. For example, since we used DITA, I added a column for Content Type. This helped with prioritizing the plan because I knew we’d want to focus on task topics first.
Then I added columns to indicate the action to be taken for each topic or piece of content: use as is, modify, or remove. An “X” would go into the appropriate column.
Next, I added a Comment column to explain the action, if needed, and add a JIRA ticket number or other references.
Finally, I added a column to indicate priority. Note that, often, my decision about priority was related not only to my task-first approach but also to a quick metrics analysis. For example, I would give lower priority to concept content that had fewer views, even though the accuracy of the content was important. This prioritization wasn’t set in stone. If an information developer came up with a plan that would not only improve the accuracy but also the performance of the content, I was all ears.
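The columns described above can be sketched as rows written out to CSV, which any spreadsheet tool can open and sort. The column names, sample GUIDs, and JIRA ticket number here are illustrative, not the exact headings from the original spreadsheet; the task-first sort reflects the prioritization approach described above:

```python
# Hypothetical sketch of the planning spreadsheet: each content item
# gets a DITA content type, one action (use as is / modify / remove),
# an explanatory comment, and a priority. Task topics sort first,
# matching the task-first prioritization described in the text.
import csv
import io

FIELDS = ["GUID", "Title", "Tab", "Content Type", "Action", "Comment", "Priority"]

items = [
    {"GUID": "G-101", "Title": "Update firmware", "Tab": "Maintain",
     "Content Type": "task", "Action": "modify",
     "Comment": "New lockdown check; see JIRA DOC-123", "Priority": 1},
    {"GUID": "G-102", "Title": "About automatic updates", "Tab": "Configure",
     "Content Type": "concept", "Action": "use as is",
     "Comment": "", "Priority": 2},
]

# Task topics first, then by priority number within each content type.
items.sort(key=lambda row: (row["Content Type"] != "task", row["Priority"]))

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(items)
print(buffer.getvalue())
```

Writing one such CSV per focus area gives each content developer a sortable checklist – by tab, title, GUID, or priority – just as the spreadsheets described above do.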
See the following example for the focus area of automated product updates.
You don’t have to duplicate my exact setup in your own project-level efforts. The idea here is to combine your content audit efforts with some kind of analysis that directs action.
You’re probably wondering at this point how I indicated the need for new content to fill gaps not covered by the modification of existing content. To be honest, I trusted our content developers. I would provide some guidance about content gaps in the content JIRA ticket to which I attached the spreadsheet. (Yes, we used JIRA for content planning; more on that in another blog.) But I would leave it up to them where best to fill in each gap, often discussing options with them during a one-on-one meeting at the start of their development effort.
In my example for the focus area of automated product updates, I wrote in the JIRA ticket: “A new task topic might be required to explain how to get or reset the XYZ card lockdown status. This new procedure could stand on its own or be added as a common procedure step before verification of component status during certain maintenance procedures. Expect some dependency on the completion of related interfaces.”
If you are writing notes like this one to your content developers, you should also determine how much of your own analysis to include. Generally, in the content JIRA ticket for a focus area, I provided a summary of the user impact, some links to the project artifacts, and the SME names so that the content developer could do further research. Obviously, I also provided those notes about content gaps.
Typically, management would assign one or two content developers to the content JIRA ticket.
The upshot is that even after I had done all of this work, the product could still evolve further as the product developers dug into their process. But the idea was to give the content developers a checklist of items to start with and some pointers to places that they should extend the plan with their own research.
Over time, I learned to simplify where I could, especially in those content JIRA tickets. But overall, I was glad to have come up with a process that enabled the team to focus their content development efforts within the Agile product development framework.
You can learn more about my team’s transformation journey by listening to the “Candid Conversations” presentation I recorded for the ConVEx 2020 conference.
You might also be interested in my companion blogs: