5. Audit

SOMETIMES, WE’LL ASK a conference audience, “How many of you know exactly what you have on your website, where it lives, and who owns it?”

Inevitably, even in rooms of several hundred people, only one or two people raise their hands.

This, friends, is a problem. To make even the most basic decisions about your content—like deciding where to focus your resources and budget—it’s good to know how much content you have, where it lives, what it’s about, and whether it’s any good.

And to know these things, you need to do a content audit.

Seeing Is Believing

A web content audit is an accounting of the content your organization currently has online. More often than not, when you’re finished (or even midstream), the results are unbelievably valuable. As we said in the first chapter, an audit can be one of your most powerful tools when making a business case for any web content project.

When you finish this chapter, you’ll understand:

• Why audits are important

• What kinds of audits are most common

• How to record your audit findings

• How much content you need to evaluate

• How to share your results

Let’s get to it.

Thinking About Skipping This Chapter? Don't

“We know the basic gist of what’s on our site.” “Somebody else in our organization must have done this before.” “I hate spreadsheets and don’t want to waste my valuable time on this.”

Here’s the deal. No matter how unnecessary or unpleasant an audit may look to you, don’t skip it. This process isn’t just about building up a nice spreadsheet of URLs and page titles. Audits can:

• Help you scope and budget for a content project

• Give you a clear understanding of what you have and where it lives, even if only to begin thinking about maintenance or content removal

• Serve as a reference for source (or existing) content during content development, making it a highly efficient tool for writers and other content creators to keep track of what they have to work with

Can’t Robots Do This for Me?

At this point you may be thinking, “An audit sounds awfully time-consuming. Surely there are widgets that can audit my website automagically!”

The answer is yes—there are “audit tools” that can crawl sites and capture basic information, such as titles and links. Some CMSes have audit-like features, too. During the audit process, this kind of technology can be extremely helpful and, in some cases, necessary. But beware. Technology doesn’t replace the context provided by human review. If you really want an in-depth understanding of your content—substance, quality, accuracy—people power is the best way to go.


Carrie Hane Dennison works for a full-service web development firm called Balance Interactive in Springfield, Virginia. She says that her clients are starting to realize they need content strategy, even if they don’t know what it is.

Although clients aren’t always looking for a line item called “strategy,” Carrie finds ways to address content strategy. She prefers to do so early in a site redesign, but sometimes works with clients after content requirements are already in place. At either stage, using an inventory is one way Carrie helps her clients understand and address strategy. She explains to clients that, for every 5 hours they spend auditing near the beginning of a project, they might save 20 hours at a later stage, preventing project delays.

After conducting an inventory of existing and needed content, Carrie asks her clients to spend time thinking about messaging and estimating the hours needed to complete the writing or content migration. She recalls one client who took one look at the inventory and said, “There’s no way I can get this all done.” Rather than setting up her own staff to fail, the client hired Carrie’s team to help migrate the content. The result was an on-time launch of the new website, instead of one that was a month late.

That’s really the point of an audit: You can anticipate problems before they arise, and avoid derailing your project. All of that time and money may seem daunting at the beginning, but Carrie’s clients are impressed when the investment means they’re able to launch a better site, on schedule.


Technology can help you get:

Quick wins: When you have a very limited timeframe to build a business case or to prepare for an upcoming web project, technology can help. For example, if you want to understand the total volume of content on a website, an audit tool can give you a ballpark estimate in a hurry.

A head start: Auto-audit tools can save you tons of time by creating a complete list of all of your content, as well as some basic info about each content piece. Additionally, in sites without traditional navigation, CMS-driven tools may be the only way to get a complete list without going totally insane.

Neutral data: In organizations large and small, content discussions can get political. In this case, technologically generated, raw, undisputed data about your content can be your best friend.

Even with all of the technical shortcuts available, many content strategists prefer to do audits by hand. There is simply no better way to fully comprehend all of your existing content.
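That said, if you do want the robots to give you a head start, even a rough crawl can seed your spreadsheet before the human review begins. Here’s a minimal sketch—not a polished tool—written in Python and assuming the requests and Beautiful Soup libraries are installed. It captures only the raw basics (URL, page title, outbound link count), and example.com is a placeholder for your own site:

import csv
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl(start_url, max_pages=200):
    # Simple same-domain crawl; collects URL, page title, and outbound link count.
    domain = urlparse(start_url).netloc
    to_visit, seen, rows = [start_url], set(), []
    while to_visit and len(seen) < max_pages:
        url = to_visit.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        soup = BeautifulSoup(response.text, "html.parser")
        links = [urljoin(url, a["href"]) for a in soup.find_all("a", href=True)]
        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        rows.append({"url": url, "title": title, "links_out": len(links)})
        # Stay on the same domain so the crawl doesn't wander off-site.
        to_visit.extend(link for link in links if urlparse(link).netloc == domain)
    return rows

with open("inventory-seed.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["url", "title", "links_out"])
    writer.writeheader()
    writer.writerows(crawl("https://www.example.com"))  # swap in your own site

A real crawler should also respect robots.txt and throttle its requests; this sketch is only enough to show where the spreadsheet comes from. The judgment calls still belong to you.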

Common Types of Audits: Choose Your Own Adventure

Here’s the single most important thing you need to know about audits: The kind of audit you do depends on what you want to learn. There is no one perfect format, size, or timing for an audit; there are many different (and totally valid) ways to audit your content. What you pick depends on your goals.

Here are a few of the most common content audits.

[Table: the most common types of content audits]

Which Audit Is Right for You?

There’s no hard and fast rule here. You can choose to do one of these audits, all three, or create your own audit format. No matter what you choose, doing a thorough audit will give you priceless information about your content.

Start by setting clear goals for your audit. These will help you determine which audit(s) you choose and what information they capture. Think about:

• What you want to learn (and why)

• What you need to prove (and to whom)

• How long you have to get the audit done (be realistic)

• Where you are in the content strategy process (if you’re not sure, see the next five chapters ...)

Let’s take a closer look at each of the audit types.

Quantitative Inventory: Just the Facts

The goal of a quantitative inventory is to learn what you have, where it lives, and a few other basic stats. No frills. Just objective facts.

A quantitative inventory is the quickest and easiest way to get some insight into your content at the beginning of a project. BUT. By simply cataloging the number of pages, downloadable PDFs, dynamic content modules, video clips, and other “live” web content for which your organization is responsible, you can wake up stakeholders to the magnitude of your content—and the budget you need to create/maintain/fix it. Cha-ching.

What to record

Here’s a list of the most common bits of data recorded in quantitative inventories. Title/Topics is a must-have. The other factors you choose depend on—you guessed it—your audit goals:

ID: Assign an identification number or code to each piece of content. (See page 57 for more information.)

Title/Topics: For a web page, this is likely the title of the page. For a content module, you may choose to use the heading or subhead. If there is no title of the content piece or page, include a short description of the key topics or themes covered.

URL: Record only where applicable.

Format: Make a note of the technical format of the content, such as text, video, PDF, etc.

Source: Specify whether the content is created in-house, by a content partner (newsfeeds, articles, blog posts, and so on), or by your users. For content created by your internal team, note who creates, approves, and publishes each piece of content, if you can. This information can be enormously helpful when you begin to ask questions about why certain content was done a certain way, or when you want to confirm it’s okay to change or remove the content. We’ll examine this topic in detail in Chapter 9, People.

Technical home: If you’re dealing with a very large site that’s hosted on a number of different servers or platforms, take note of where the content lives within your technical infrastructure. For example, is the content stored in a content management system (CMS) or an inventory system, or is it fed to the site via an API? Sometimes content may be stored in very strange places, so be prepared to do some digging.

Metadata: Metadata is “data about data.” In this case, we’re talking about attributes (such as keywords and tags) assigned to each piece of content. These valuable data nuggets help people find content on search engines, on your site, and in your CMS. When planning findable, functional content during the strategy phase, you’ll need to know what metadata exists. If you don’t know where to find your metadata, your friendly IT colleague, web developer, or SEO consultant should be able to help.

Traffic/usage statistics: There are analytics available for almost any kind of online content. If it’s feasible, get the skinny on how people are using (or not using) each piece of content. An internal analytics person or a representative from your analytics provider can often help you get the information you need.

Last update: When was the last time somebody in your organization paid attention to this piece of content? Most CMSes record a “last update” date—and that information can give you hints about the significance of the content, the content workflow, and more. Just don’t make any drastic assumptions—keep it in context.

Language: If you have content in multiple languages, you’ll want to record the language or dialect used on each piece of content.
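To make those columns concrete, here’s a minimal sketch of what one inventory row might look like as a CSV, written in Python. The column names and the sample row (borrowed from a made-up nursery page) are just illustrations—keep the fields that match your own audit goals:

import csv

# Column headings are illustrative; pick the ones your audit goals call for.
COLUMNS = [
    "id", "title_topics", "url", "format", "source",
    "technical_home", "metadata", "traffic", "last_update", "language",
]

# One made-up row for a fictional nursery site.
rows = [{
    "id": "2.1.1",
    "title_topics": "Shade Perennials",
    "url": "https://www.example.com/plants/perennials/shade",
    "format": "text",
    "source": "in-house (writer: staff horticulturist; approver: web editor)",
    "technical_home": "CMS",
    "metadata": "keywords: perennials, shade, hosta",
    "traffic": "1,250 visits/month",
    "last_update": "2011-06-14",
    "language": "en-US",
}]

with open("quantitative-inventory.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerows(rows)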

Note: robots welcome here

While it’s always helpful to review your content personally, quantitative inventories are where the robots really shine. Because quantitative inventories are about collecting raw data (no human judgment required), the right technical tools can save you time and energy.

Qualitative Assessments: Deeper Dives

Seeing what content you have and where it lives is helpful, but only to a point.

Many a site map has been constructed based solely on page titles. But when it comes to qualifying the usefulness of content, a page title doesn’t tell you what the content actually says, or if it’s useful to your audience. That’s where a qualitative audit comes in.

A qualitative audit analyzes the quality and effectiveness of the content.

The key distinction between quantitative inventories and qualitative audits is human judgment. Qualitative audits are a robot-free zone. An actual human being has to look at each piece of content and evaluate it based on defined characteristics.

In our chart on page 50, we listed two kinds of qualitative audits. Both take the quantitative inventory and go a few steps further:

Best practices assessment: Usually done early in the project, a best practices assessment looks at your content from an outsider’s point of view. It measures your content against best practices and user needs. It helps you understand if your content is useful, usable, enjoyable, and persuasive to your audience—or what you need to do to make it so.

Strategic assessment: A strategic assessment is the most full-featured of all audits. Once you have a strategy in place, a strategic assessment gives you an idea of how your existing content aligns with it. Where are the gaps? What needs to change? What’s terrific as it is? A strategic assessment can combine factors from a best practices assessment with strategy-specific criteria. Note that this is often informed by analysis and recommendations—we talk about a lot of this in the next chapter.

Sample qualitative audit factors

In addition to the information gathered in a quantitative audit, there are dozens (if not hundreds) of possible subjective factors you can review during a qualitative audit. We generally choose 5–6 factors based on the situation. The table on page 55 shows a few of our favorites.

Create your own factors

As we mentioned, the table only includes a few sample audit factors. Feel free to be creative and think of your own. Just be sure you can:

• Evaluate the factor by looking at individual pieces of content (not groups or categories of content)

• Use the factor to assess most or all of the content (i.e., not just one type of content)

• Develop clear, specific guidelines for measurement (including ratings or categories)

The More (Auditors) the Merrier

If you’re auditing 1,500 pieces of content, one person can handle it. (Yes, really!) But if you’re looking at 10,000 pieces, you’re going to want some help.

When you share audit responsibilities, it’s absolutely imperative that your audit criteria, ratings, etc., are crystal clear. In addition to defining and communicating the criteria, you may want to:

• Have one person test the audit criteria before splitting up the work

• Create some examples of each criterion or rating

• Have regular check-ins with the audit team and spot-check each other’s work

• Make sure that in addition to ratings and pre-defined lists, auditors have a notes field to jot down anything out of the ordinary or explain their thinking

When it’s all over, you can commiserate about the agony of audit eyeball (where you can no longer look at the screen without seeing double).

QUALITATIVE AUDIT FACTORS

[Table: sample qualitative audit factors]

Audit Spreadsheets: Choose Your Weapon

Back in the old days (like, 2005), auditing was so easy. Web content = website pages. The format of audit findings was always the same: a simple list of pages, ordered by navigation, in a spreadsheet.

Today, getting a handle on your content can be more complex. Content isn’t necessarily assigned to a single page on a website anymore—in fact, it might not be on a website at all. Even if it is, it might be displayed differently depending on the user’s behavior, preferences, or device (computer, phone, tablet, etc.).

Spreadsheets are still the go-to format for most audits, but the tools are evolving to accommodate new kinds of content. Let’s take a look at a few of the more popular options and how they work.

The Basic Spreadsheet: Old Faithful

If you have a traditional website—where content is assigned to a specific page within a fixed navigation scheme (usually a home page with lots of neatly organized pages underneath)—a basic spreadsheet is the tool for you. Here’s an example of a basic spreadsheet for a fake plant nursery website:

[Figure: basic audit spreadsheet for a fictional plant nursery website]

To audit your site, you simply click through every page of your site (usually in order) and record the information in an outline format. List major website sections as your top-level “parent” (or primary) sections. Then plug in pages and modules as “children” (or secondary, tertiary, and so on) sections or pages in each main section. Most of the time, the pages are numbered the same way you’d organize a document outline (1.0, 1.1, 1.1.1, and so on).

A note about ID numbers

If you don’t already have a numbering system for your web content, it’s a good idea to start one during the audit process. By assigning a unique ID to each page or component, you have an easy way to reference each piece of content, categorize content for analysis, and get an understanding of how pieces of content relate to each other. Lastly, a number system will help you link your audit findings to other web project documentation—the number of a specific piece of content can correspond to the content strategy recommendations for that content, etc.
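If you’d rather not number every row by hand, here’s a rough sketch of how outline-style IDs (1.0, 1.1, 1.1.1, and so on) could be generated from a nested list of sections and pages, in Python. The nursery structure below is invented for illustration:

# Each entry is (title, children); the nesting mirrors the site's navigation.
def assign_ids(tree, prefix=""):
    rows = []
    for position, (title, children) in enumerate(tree, start=1):
        number = f"{prefix}{position}" if prefix else str(position)
        # Top-level sections get a trailing ".0" (1.0, 2.0, ...); children nest below.
        rows.append((number if prefix else number + ".0", title))
        rows.extend(assign_ids(children, prefix=number + "."))
    return rows

site = [
    ("Home", []),
    ("Plants", [
        ("Perennials", [("Shade Perennials", []), ("Sun Perennials", [])]),
        ("Shrubs", []),
    ]),
    ("Services", [("Landscape Design", [])]),
]

for content_id, title in assign_ids(site):
    print(content_id, title)

# Prints: 1.0 Home, 2.0 Plants, 2.1 Perennials, 2.1.1 Shade Perennials, and so on.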

Spreadsheet 2.0: When Content Flexes and Changes

As we mentioned above, today’s smarty-pants programmers have made it possible to customize website content based on who you are, your past behavior, or the device you’re using. (For example, people viewing a page on a mobile phone may only see half of the content available on a computer.)

When your site has these bells and whistles, variations need to be included in your audit. If your site has a set structure (where the navigation is basically the same for everybody), you’re in luck. You can still do the inventory in an outline manner with some small adjustments. Start by choosing one version of the content to be the root of your inventory (the primary user group, the most common device, etc.). Then amend your ID system and spreadsheet to indicate variations.

Back to our nursery example. Let’s say the nursery website has two audiences: the general public and professional landscapers. People who are logged in as professionals get expanded or different information. In this (very simple) example, we’ve added “:g” to the end of the ID numbers of pages targeted to the general public, and “:p” for versions of the pages for professionals.

[Figure: nursery spreadsheet with “:g” (general public) and “:p” (professional) ID variations]

In these types of audits, consistency is key. Regardless of the numbering system you select, make sure it’s used correctly throughout the audit.
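One way to keep yourself (and your fellow auditors) honest is a quick consistency check over the ID column. This Python sketch assumes the “:g”/“:p” suffix scheme from the nursery example; adjust the pattern to whatever scheme you actually adopt:

import re

# Matches outline IDs with an optional audience suffix, e.g. "2.1.3:g" or "2.1.3:p".
ID_PATTERN = re.compile(r"^\d+(\.\d+)*(:[gp])?$")

ids = ["1.0", "2.1:g", "2.1:p", "2.1.1:g", "2.1.1:P", "3,2"]  # the last two are typos

for content_id in ids:
    if not ID_PATTERN.match(content_id):
        print(f"Inconsistent ID: {content_id}")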

Indexed Inventory: When Things Get Really Hairy

Now come the sites, apps, and content channels that are so incredibly flexible it seems like there is no navigation at all. Or, there really isn’t any navigation. Just tags. Or facets. Or something.

Whatever the situation, the content in question cannot be audited in an outline format. You can still use a spreadsheet for your audits, but you likely need to:

• Get a list of content pieces from the backend. It’s really hard to get a complete list of content pieces by clicking around. So, if you’re unfamiliar with the backend system, it’s time to make friends with an IT or CMS-focused colleague.

• Document (or find the documentation for) the user characteristics or behaviors that cause the system to display each piece of content. Again, the backend people probably have this all worked out in a fancy model somewhere.

• Categorize the content into groups for analysis such as topic, product type, audience segment, or internal content owner. Create a meaningful numbering/indexing system based on your categorization. (If the CMS does this for you, too, hooray! But, often the CMS numbering system is too abstract for the purposes of an audit analysis. Boo.)

Our nursery audit might look like this:

[Figure: indexed inventory for the nursery content]

Now this is a simple example; things often get significantly more complicated. Indexing systems get tricky, and, in some cases, it’s easier to create an audit database instead of a spreadsheet. Make the choice based on your audit goals, the people working on the audit, and the size of the mess you might be in.
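For what it’s worth, here’s one possible sketch of that backend-plus-categorization approach, in Python. It assumes your IT colleague can hand you a JSON export of content records; the file name and field names (“topic,” “title”) are hypothetical:

import json
from collections import defaultdict

# Assumes a JSON export from the backend team; field names are made up for illustration.
with open("cms-export.json") as f:
    pieces = json.load(f)

by_topic = defaultdict(list)
for piece in pieces:
    by_topic[piece.get("topic", "uncategorized")].append(piece)

# Build index codes like "PER-001", "PER-002" per topic group for the audit spreadsheet.
rows = []
for topic, group in sorted(by_topic.items()):
    code = topic[:3].upper()
    for number, piece in enumerate(group, start=1):
        rows.append((f"{code}-{number:03d}", topic, piece.get("title", "")))

for row in rows:
    print(row)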

Do You Really Need to Look at All of the Content?

It depends. If you have fewer than 5,000 pages/pieces of content, you should probably look at all of it. Yep. All of it. Why? Because you can: it’s humanly possible to do so.

But what happens if you have 25,000, 100,000, or 100,000,000 pieces of content? Or, you don’t have enough time to review 5,000 pages? When looking at every page is not an option, you have two choices: content sampling or rolling audits.

Content Sampling

One way to audit huge piles of content is to review a “representative sample” of your content.

Choosing your sample

The challenge with creating a sample is deciding what content should be included. Sure, you could just do a randomly generated selection of content items, but usually it’s better to make your sample more intentional by basing it on what you want to learn.

Brain Traffic’s Christine Anameier suggests considering the following criteria when selecting your audit sample:

Content objectives: If 70% of the site content is designed to increase sales and 25% is dedicated to customer support, your sample set can reflect those percentages. The remaining 5% (such as job postings or corporate philanthropy information) can be sampled lightly or not at all.

User groups: Divide sample content by user group—ensuring content for each of your major user segments is represented. Better yet, prioritize the user groups and sample more pages for the highest priority users.

Traffic: Site analytics can show you which pages or sections get the most visits and which get the least. Depending on your business goals, you may choose to focus on the high-traffic content, low-traffic content, or a combination of all traffic levels.

Content ownership: It may not be possible to include work from all teams of content contributors, but it’s helpful to get a good mix from the people who regularly create your content. If the sample consists mostly of one group’s work, it may not reflect the content as a whole and other teams may not embrace the audit findings.

Update or maintenance frequency: Some content is maintained lovingly. Other content is left to go stale. If either of these two scenarios is over-represented in your sample, the results of the audit will be skewed—creating a false sense of pride or doom.

Depth: It’s tempting to audit only top-level pages of a website, but with today’s search tools, customers may never even see your top-level pages. The “deeper” content is often where your customers run into major problems. In his book, Killer Web Content, Gerry McGovern writes, “I come across many websites where there is a well-designed top level with quality content. However, when you click down a few levels, everything changes—it’s like walking out of a plush hotel straight into a rubbish dump.” So you may want to look at a cross-section from all content levels.
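As a concrete illustration, here’s a sketch of drawing a proportional sample based on the content-objectives split above (70% sales, 25% support, 5% everything else), in Python. It assumes your quantitative inventory already tags each piece with an objective; the column name and file name are placeholders:

import csv
import random

# Assumes the inventory CSV has an "objective" column; proportions follow the 70/25/5 example.
PROPORTIONS = {"sales": 0.70, "support": 0.25, "other": 0.05}
SAMPLE_SIZE = 400

with open("quantitative-inventory.csv", newline="") as f:
    inventory = list(csv.DictReader(f))

sample = []
for objective, share in PROPORTIONS.items():
    pool = [row for row in inventory if row.get("objective") == objective]
    wanted = min(len(pool), round(SAMPLE_SIZE * share))
    sample.extend(random.sample(pool, wanted))

print(f"Sampled {len(sample)} of {len(inventory)} pieces for qualitative review")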

When choosing your sample, how much content is enough?

There’s no rule or benchmark for picking the perfect sample size. It would seem like the more content you could review, the better off you’d be. That’s somewhat true, but mostly you just have to look at enough content to see patterns emerge, answer your questions, or reduce uncertainty.

On a relatively small site (say, 10,000 pieces of content), you might need to look at half of the content before the patterns become obvious. For a million-page site, you might decide to look at only 1% of the content. That’s still 10,000 pages ... so you’re not exactly off the hook. But you should begin to recognize some valuable patterns. You won’t have the same level of certainty about your findings as you did with the smaller site audit, but you’ll have some ideas. And you probably aren’t going to learn anything else by auditing another 1,000 or 10,000 items—comparatively, the percentage of items reviewed is still so low that the change in the margin of error is microscopic.
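If you like to see the math, here’s a quick back-of-the-envelope check in Python, using the standard survey margin-of-error formula (95% confidence, worst-case 50/50 split, with a finite-population correction). For a million-piece site, a 10,000-piece sample gives roughly a ±1% margin, and doubling the sample barely moves it:

import math

def margin_of_error(sample_size, population, z=1.96, p=0.5):
    # Standard survey margin of error at 95% confidence, worst-case p=0.5,
    # with a finite-population correction.
    return z * math.sqrt(p * (1 - p) / sample_size) * math.sqrt(
        (population - sample_size) / (population - 1)
    )

for n in (10_000, 20_000):
    print(n, round(margin_of_error(n, 1_000_000) * 100, 2), "% margin of error")
# 10,000 pieces -> about 0.98%; 20,000 pieces -> about 0.69%.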

Here’s a rough table of suggested sample sizes (based on common market research sampling practices):

[Table: rough suggested sample sizes]

If you can’t make these benchmarks or just want to ignore them, don’t sweat it. Adjust the sample size to your resources and time. Just about any sample will tell you something as long as you (and the people you show your results to) understand what content you reviewed and why.

Rolling Audits

Another effective way to audit large sites is a rolling audit—an audit that basically never ends. Lou Rosenfeld (Rosenfeld Media) says an audit or inventory “shouldn’t be something that you allocate the first two weeks of your redesign to; allocate 10 or 15 percent of your job to it instead.”

It works like this: In January, you audit one area of a website. In February, you audit a different area. In March, you move on to a third, and so on. Eventually, when all of the content of the site is audited, you start over with the first category again. (It doesn’t have to be monthly, either.)

The benefit of a rolling audit is that more content gets looked at, in a more careful manner, more often. This works best when stakeholders can agree to focus the first phases of the audit—and the content strategy—on a few discrete areas of the site.

And, guess what? On a super huge site, you can do a rolling sample of each area of the site instead of a rolling audit. It’s like a dream come true, really.

Tabulate Your Results

When you’ve finished evaluating all of your content, stop and celebrate. Just bask in the glory of that completed spreadsheet. Have a cupcake! Take a nap!

Okay, that was fun. Back to work.

By this time, the audit team likely has a good idea of where your content shines and flounders. But, there’s nothing like cold, hard numbers to drive home the point. So, take the time to tabulate the results and look for patterns. You’ll be able to answer questions like:

• How much content do we have, exactly? Do we have more in some categories than we would have expected?

• Which areas of content score especially high or especially low on any specific factor?

• Do we have a disproportionate amount of content for one audience segment?

• How much of our content is out of date or inaccurate?

When you crunch the numbers, you often find some pretty insightful stuff.
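If your finished audit lives in a spreadsheet, a few lines of Python can do the crunching. This sketch assumes a CSV export with hypothetical columns for section, audience, an accuracy rating, and an “outdated” flag—rename them to match whatever you actually recorded:

import pandas as pd

# Hypothetical columns: "section", "audience", "accuracy" (1-5 rating), "outdated" (yes/no).
audit = pd.read_csv("qualitative-audit.csv")

# How much content do we have, and how is it split across audience segments?
print(audit["section"].value_counts())
print(audit["audience"].value_counts(normalize=True))

# Which sections score especially high or low on accuracy?
print(audit.groupby("section")["accuracy"].mean().sort_values())

# What share of our content is flagged as out of date?
print((audit["outdated"] == "yes").mean())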

Share Your Findings

At this point, you may be totally enamored with your spreadsheet and raw data—we are, too—but chances are your business stakeholders won’t share your devotion.

For them, you’ll need an audit report. Prepare the report in whatever format suits your audience: it can be a presentation or a full-scale document. You just need to convey the results of the audit and provide a reference for future discussions.

An audit report usually has three parts: an overview of the audit process, a path to access the raw data, and the findings report.

Overview of the Process

First, your report should provide a brief description of the audit process. This will help make your findings more understandable and believable. You might want to include:

Goals of the audit: Why you did the audit, what you hoped to learn, and what you want to do with the results.

Audit factors and measurement criteria: A brief overview of what was measured and how.

Scope: What areas of content were audited and why. If the content is particularly time-sensitive, you may need to include a date range as well.

Path to Raw Data

Some people who see your audit report are going to want to see the data. If you so choose, you can provide them with a link to the spreadsheet. But, don’t just let them loose with it. Provide a spreadsheet guide that explains how the spreadsheet works and alleviates any confusion.

Findings Report

Third, and most importantly, you’ll want to provide your audit findings. Your findings can include:

• A summary of overall conclusions and recommendations

• A description of each audit factor

• Data summaries per factor

• Factor-based themes or suggestions (with examples, when possible)

The report you create depends on the people who will see it. How much do they need to know? How much do they want to know? You may have different reports for different groups of people for the same audit. Here are some examples of what you might see in an audit report.

Formal detailed report

In this example, the auditor pulls out all the stops: a formal report made for an audience that craves details. The audit examined the types of content formats used throughout the site. On this page of the report, you have:

1. A description of the audit factor

2. Graphic depiction of the results

3. Data cross-referencing site section and format

4. Key findings and analysis about the factor

[Figure: page from a formal, detailed audit report]
Casual summary report

This report is more casual—possibly directed at a core team who is familiar with the audit and the content. It doesn’t go into the details, but gets the high-level messages across. This page includes:

1. Summary of the key finding for this factor

2. A bit of detail that clarifies problem areas

[Figure: page from a casual summary report]
Presentation-style report

This slide is from a presentation that might summarize a report or may be the report itself. The slide headline summarizes the key finding, and the image provides some data for backup.

[Figure: presentation-style report slide]

It’s very likely that conducting an audit will earn you a huge promotion, a ton more money, and a year’s vacation. Well, that might be a bit optimistic. At the very least, it will get people’s attention. And once you have people’s attention, you have the opportunity to present a business case for your next project or initiative.

But Wait, There’s More

With a completed audit under your belt, you should have a clear understanding of the content you have. It’s likely you have lots of ideas about what to do next. Why not jump right in?

Hold on there, chief.

There’s more juicy info to be had about user needs, competitors, and the content team within your organization.

Next stop? Analysis.
