The answer is Google Analytics—what was the question?

Lately, I’ve seen a collection of blog posts about using Google Analytics for technical or, more generally, informational content that seem to use a formative research method that goes something like this: here’s the data you can collect; now let’s imagine the questions it might answer. It’s not that this method isn’t valid, just that it’s usually done at the beginning of a project and doesn’t scale particularly well. What many technical communicators could use is summative data on how their content is performing day to day.

Can Google Analytics provide useful summative metrics? Sure, but very few that monitor what matters to the reader. For informational content, most of them fall into the vanity-metrics category. The reason is that people don’t come to informational sites to read web pages or click links (which is what Google Analytics tracks). They come to accomplish a goal, and that goal is likely not found on your informational website. Your site, if it is doing its job, is a means to another goal.

So, how can you measure whether a reader accomplished their goal? And, by measure, I mean observe directly and not infer (or imagine) reader behavior.

Down the funnel

Let’s start with the questions that Google Analytics is good at answering.

Google Analytics is designed to track user experiences that are goal-oriented and happen entirely within the website. The goal is typically to take a specific action on the site, usually something along the lines of spending money. The purchase-funnel interaction is much simpler than how readers interact with informational sites (which is not to say the funnel is simple). Consequently, Google Analytics’ utility for analyzing informational sites is limited to qualitative, formative exploration rather than collecting ongoing, meaningful performance metrics. If you describe your “funnel” as ending up on any one of hundreds (or thousands) of pages, that’s not a funnel, at least not in the sense that Google has modeled its tool.

People don’t want drills, they want holes

Consider the purpose of a hand drill. Is it to spin a drill bit or to make a hole?1 Why does the customer buy one? Now, let’s attach Google Analytics for Power Tools to the hand drill. It might count revolutions, drill-bit changes, drill-bit diameters used, and number of trigger pulls. However, it won’t (can’t) count holes drilled correctly, injuries sustained through improper use, customer satisfaction with the experience, or anything else that happens outside of the drill. Which values do you think matter most to the drill user? Unfortunately, they are exactly the ones that Google Analytics for Power Tools can’t count.

While visitors to your site might open your pages and spend time with them open (which is all that Google Analytics can tell you they did), remember that they are not there to look at, or even read, your content. To the reader of an informational site, your content is a means to a different goal: something that probably has nothing to do with interacting with your site at all.

Readers’ experiences are what count

At least, they are what count to the reader. If you want to know about your readers’ experience of your site, their experiences are what you should track.

The pages I read that talk about using Google Analytics for informational content (linked below) describe qualitative and exploratory use of Google Analytics. This is a good application of the tool (when used carefully), but it’s time-consuming and imprecise. Almost all of the advice is couched in conditional language, which means the analytics aren’t telling you what you really want to know; they might hint at it, but they can’t tell you for sure.

So why not just ask the reader? Did they bounce because they accomplished their goal, or not? Don’t guess based on average time-on-page values; ask them! Make it easy for them to answer, and don’t get greedy for information. Remember, they don’t want to be there in the first place.

Twilio gets it

My latest favorite documentation example of this is Twilio’s API documentation. Their reference page has a Qualaroo pop-up that asks:

“Did the documentation on this page serve your needs?” Yes/No.

Rather than guess at why I have the page open in my browser as I type this, they’re asking me. Now they’ll know for sure. Even better, if I click No, they ask a few more questions (the last one resulting in a t-shirt offer), but I could stop answering anywhere along the way. No pressure.
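To make the idea concrete, here’s a minimal sketch of the kind of yes/no prompt a documentation page could embed. This is not Twilio’s or Qualaroo’s actual code; the /api/page-feedback endpoint and the payload fields are my own hypothetical stand-ins.

```typescript
// Minimal yes/no feedback prompt in the spirit of the Twilio example.
// The /api/page-feedback endpoint and its payload are hypothetical.
function renderFeedbackPrompt(container: HTMLElement): void {
  const question = document.createElement("p");
  question.textContent = "Did the documentation on this page serve your needs?";

  const answer = (servedNeeds: boolean) => {
    // Record the reader's answer along with the page it applies to.
    void fetch("/api/page-feedback", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        page: window.location.pathname,
        servedNeeds,
        timestamp: new Date().toISOString(),
      }),
    });
    container.textContent = "Thanks for the feedback!";
  };

  const yes = document.createElement("button");
  yes.textContent = "Yes";
  yes.onclick = () => answer(true);

  const no = document.createElement("button");
  no.textContent = "No";
  no.onclick = () => answer(false);

  container.append(question, yes, no);
}
```

A real prompt would also want a follow-up question or comment box for readers who answer No, which is where the most useful detail tends to come from.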

Twilio’s method is similar to the tool I developed for my dissertation research project. It provides data that is much less ambiguous, data that can be aggregated in a meaningful way without magic (or tedium), and data that includes detailed comments from readers who choose to leave them. What’s not to like?

I don’t know how their data collection is organized on the back end, but with the type of data I see it collect, I could relate it to other metrics and, for example, know how long readers whose needs the page served viewed a page compared to those whose needs weren’t served. (In my experiments, the two were almost the same, by the way.) But, tantalizing queries aside, it can give me an ongoing, automatically computed metric of how well the content meets readers’ needs.
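As a sketch of what that ongoing metric could look like, suppose each response were stored with the page path, the yes/no answer, and the reader’s time on page; the record shape and field names below are assumptions of mine, not how Twilio or Qualaroo organize anything.

```typescript
// Hypothetical feedback record; the field names are illustrative only.
interface FeedbackRecord {
  page: string;
  servedNeeds: boolean;
  timeOnPageSeconds: number;
}

// For one page's responses, compute the share of readers whose needs were
// served and the average time on page for each group.
function summarize(records: FeedbackRecord[]) {
  const served = records.filter((r) => r.servedNeeds);
  const notServed = records.filter((r) => !r.servedNeeds);
  const avgTime = (group: FeedbackRecord[]) =>
    group.length === 0
      ? 0
      : group.reduce((sum, r) => sum + r.timeOnPageSeconds, 0) / group.length;

  return {
    responses: records.length,
    servedRate: records.length === 0 ? 0 : served.length / records.length,
    avgTimeServed: avgTime(served),
    avgTimeNotServed: avgTime(notServed),
  };
}
```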

One advantage of this type of feedback is that the prompt (question) can be tailored to the page type and the reader goal you’re interested in. For example, a topic that should help a reader accomplish a goal can ask whether it actually did so. A page that is intended to inform, such as a conceptual topic, can ask whether the reader now understands the topic. And so on.
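In practice, the prompt could be chosen from a small page-type map like this hypothetical one:

```typescript
// Hypothetical mapping of page type to the question its feedback prompt asks.
const promptByPageType: Record<string, string> = {
  task: "Did this page help you accomplish what you came here to do?",
  concept: "Do you understand the topic after reading this page?",
  reference: "Did the documentation on this page serve your needs?",
};
```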

On the flip side, after looking at the price of the products on the Qualaroo site, I can see why this isn’t common. They are pretty expensive. But I’m guessing you get more than a pop-up on the reference topic for the price. Most importantly, it shows that knowing what the customer thinks is something Twilio values.

Ask for the data you need

The bottom line: know what your content should do, and then find out whether it is doing it. If your content should inform, ask the reader if they were informed. If the content should solve a problem, ask the reader if it solved the problem. Don’t settle for what Google gives you for free if it isn’t going to answer your questions. Understand your audience and your content, and track what’s important to you.

Sites referenced

Footnotes:

1. Holes/Drills notion attributed to Theodore Levitt in What Customers Want from Your Products
