Identifying the work focus and/or the work plan during development also proves to be common ground for both the project manager and the content strategist.
- The project manager analyzes the requirements and the available resources and delivers a work breakdown structure or work package, whether that results in a set of stories for a two-week sprint or a full-blown Gantt chart.
- The content strategist performs a similar analysis, but must also consider how existing content and communication channels match with user wants and needs, as revealed by the audience and task analysis that she performed in the early stages of the project. The deliverables from this effort might also vary in form and format.
Note: This blog is the second in a five-part series that examines how the elements of the content strategist role both parallel and intersect those of the project manager’s role. See part 1 published in March 2019.
To shape the work and the work plan, both roles look to the future state and perform a gap analysis, particularly if the customer deliverable is a new version of an existing product. The primary tools a content strategist uses to perform this gap analysis are a content inventory and a content audit. Both help scope the work by answering the question "what is missing?" and both serve as a reference during development.
What is the difference?
Content Audit vs Content Inventory
Ann Rockley and Charles Cooper in their seminal book Managing Enterprise Content: A Unified Content Strategy distinguish a content inventory from a content audit by telling us that a content audit is the “process of actually looking at the content and assessing its value and opportunities for reuse.” In other words, a content audit is qualitative in nature.
On the other hand, the content inventory is, according to Kristina Halvorson and Melissa Rach, more quantitative in that it provides "just the facts," focusing more on listing content and its use and/or location. (Their book Content Strategy for the Web is one of my favorites.)
Choosing one over the other might be driven by time and/or resources. CMS, LMS, and database tools can help in assembling a content inventory, and often the resulting list can be pulled into a spreadsheet. Then fact-based decisions can be made about what to keep and what to add to meet the parameters of the project and the needs of the audience. If the customer deliverable and audience(s) are similar enough to a previous deliverable, perhaps the inventory and some straightforward gap-filling are all that is needed.
I’ve leveraged such spreadsheets myself to accomplish such goals as planning web navigation and deciding where a new, small set of task topics best fit in the overall scheme of content deliverables.
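The inventory-driven gap-filling described above can be sketched in a few lines of code. This is a minimal, hypothetical example: the column names ("title," "type," "location") and topic titles are illustrative assumptions, not drawn from any real project, and in practice the inventory would be exported from a CMS rather than typed in by hand.

```python
# Hypothetical sketch: a content inventory as rows of "just the facts,"
# compared against the topics a new deliverable requires to reveal gaps.
# All column names and topic titles are illustrative assumptions.

inventory = [
    {"title": "Installing the product", "type": "task", "location": "/install"},
    {"title": "What is widget mode?", "type": "concept", "location": "/concepts"},
    {"title": "CLI reference", "type": "reference", "location": "/reference"},
]

# Topics the new release and audience require (one is new in this release).
required_topics = {
    "Installing the product",
    "Upgrading from version 1.x",
    "CLI reference",
}

existing = {row["title"] for row in inventory}
gaps = sorted(required_topics - existing)

print("Missing content:", gaps)
```

The same set-difference idea scales to a real spreadsheet export; the point is that a quantitative inventory makes "what is missing?" a mechanical question.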
Benefits of a Content Audit
A more qualitative approach to content analysis would have to be more narrowly focused, but might yield more useful results. All of the experts I've encountered suggest taking a representative sample and then applying a series of assessments to what you've gathered: alignment with best practices, strategic fit, and/or reusability. (Content reuse is a big topic, and I defer to other experts in the field, such as the folks at CIDM, for guidance in a later installment of this blog.) Then apply the findings to the greater content set.
Note that assessing content for its alignment to best practices should include an agreed-upon rubric. How does your team describe quality content? You might want to start with Ahava Liebtag's Step-by-Step Checklist and adapt as needed – especially if your deliverables include more than text-based content. Another good resource is Sarah O'Keefe's "hierarchy of content needs," which she describes in a recent TechComm article, "Understanding Content Strategy as a Specialized Form of Management Consulting." Remember to consider the customer experience with the content, too.
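An agreed-upon rubric can even be made executable. The sketch below is a hypothetical illustration: the criteria names and weights are assumptions a team would negotiate for itself, not a published standard, and the sample topics are invented.

```python
# Hypothetical rubric sketch: score each sampled topic against agreed
# criteria. Criteria names and weights are illustrative assumptions.

RUBRIC = {
    "accurate": 3,       # technically correct and current
    "findable": 2,       # titled and tagged so the audience can locate it
    "task_oriented": 2,  # supports what the user is trying to do
    "on_brand": 1,       # matches voice and terminology guidelines
}

def score(topic_assessment):
    """Sum the weights of the criteria a topic satisfies."""
    return sum(weight for criterion, weight in RUBRIC.items()
               if topic_assessment.get(criterion))

# A representative sample, assessed by the audit team.
sample = {
    "Installing the product": {"accurate": True, "findable": True,
                               "task_oriented": True},
    "Legacy FAQ": {"accurate": False, "findable": True},
}

for title, assessment in sample.items():
    print(f"{title}: {score(assessment)}/{sum(RUBRIC.values())}")
```

Whatever the criteria, writing them down (and weighting them) is what turns a subjective audit into findings the team can apply consistently across the greater content set.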
Whichever path you take in a qualitative audit, I promise you the effort will be worthwhile – even quantifiably so, as strategist Carrie Hane Dennison, quoted by Halvorson and Rach, attests.
Content audits can also lead to creative problem-solving. I’ve recently leveraged the results of a set of rolling content audits I performed to propose a new approach to a specific content type, setting our content (and my team) up to grow in a strategic direction while meeting the needs of a new audience.
Application to Development
The gap analysis – defined in terms of opportunities to meet new or evolving audience/user needs – helps both the project manager and the content strategist decide what to do next. Fill the gaps, of course.
In the world of content development, especially when it engages sophisticated tools like a CMS and DITA, a content strategist and development team have lots to consider:
- What type of content is best to meet the audience's needs? (DITA gives us the archetypes of task, concept, and reference.)
- What level of detail is needed?
- How best can the content be conveyed to the audience (table, illustration, video, help file, FAQ, chatbot, etc.)?
- How should differences among audiences be accommodated?
- How should the content be categorized? And how should metadata be applied?
- How does the content leverage or expand existing infrastructure? For example, an existing information model? An existing reuse strategy?
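Several of the considerations above – content type, audience differences, categorization, and metadata – come together once topics carry structured tags. The sketch below is a hypothetical illustration, not a real information model: the field names, audiences, and channels are all assumptions, standing in for whatever taxonomy a team's CMS and reuse strategy define.

```python
# Hypothetical sketch: topics carry a DITA-style type plus metadata, so
# one source set can be filtered per audience or delivery channel.
# Field names, audiences, and channel values are illustrative assumptions.

topics = [
    {"title": "Back up the database", "type": "task",
     "audience": ["admin"], "channels": ["help", "pdf"]},
    {"title": "How replication works", "type": "concept",
     "audience": ["admin", "developer"], "channels": ["help"]},
    {"title": "Error codes", "type": "reference",
     "audience": ["developer"], "channels": ["help", "chatbot"]},
]

def for_audience(topics, audience):
    """Select the titles of topics tagged for a given audience."""
    return [t["title"] for t in topics if audience in t["audience"]]

print(for_audience(topics, "developer"))
```

The design choice worth noting: metadata applied up front is what lets a single content set serve multiple audiences and channels later, which is why these questions belong in planning even when the answers evolve during development.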
Not all of this has to be decided up front. Content development is as much discovery as development. But what should be decided up front is how the development team – content developers and product developers – work together.
How a content strategist applies the outcome of a content inventory or content audit will vary by project. Often, the breadth of the project and the level of engagement with product development – and with learning deliverables development – will influence that initial project planning piece. My team, in the midst of tackling content to support a major release and a new audience, has chosen, for example, to leverage a modified Kanban approach. We use an Agile-friendly tool that allows content strategy inputs and analysis to be recorded in “spike” stories within a larger “epic” that contains multiple work “stories” or issues.
Note that prioritization of content development can be part of this planning, too, but often evolves through later stages of development.
When planning for – and validating – a new (or expanded) delivery channel or content approach, my team also likes to engage in proofs of concept with incremental reviews, a topic I will delve into more deeply in the next blog in this series.