Why I’m passionate about content metrics

…and why return on investment (ROI) is not the best metric to use for measuring content value.

I’ve been ruminating on the topic of content value (more than I usually do) since Tom Johnson published his essay on technical communication value last month. I’ve studied measurement, experimentation, and statistical process control in applications other than writing, and I have also seen them repeatedly and painfully applied to writing. The results have been, almost without exception, anywhere from disappointing to destructive. Until this morning, I hadn’t been able to articulate why. Thankfully, a post on Medium by Alan Cooper provided the shift in perspective that helped me bring everything into focus. I’m passionate about this because it’s the new view of tech comm that the 21st century needs.

Writing is a process, but it’s not a manufacturing process

Alan Cooper’s post, titled “ROI Does Not Apply,” describes how “ROI is an industrial age term, applicable to companies that manufacture things in factories.” Writing is not a manufacturing process. Granted, there are some manufacturing-like elements of the writing process as you get closer to publishing the content—and honestly, making that part of the process as commodity-like as possible has some benefits. But writers add the most value to the content where the process is complicated and non-linear: towards the beginning, where concepts, notions, and ideas are brought together to become a serial string of words. That’s where ROI and other productivity measures are inappropriate—they don’t measure what’s important to adding value.
Continue reading “Why I’m passionate about content metrics”

Measuring Value for WriteTheDocs Boulder

[Slides: presentation given to WriteTheDocs-Boulder]

I had the pleasure of joining the Boulder/Denver WriteTheDocumentarians at their meetup in Denver last month. I presented a short talk on what I had learned about measuring the value of the content that technical writers typically produce, which started an enlightening conversation with the group.

I’ve linked the slides here and provided a brief narrative to go with them. Unfortunately, you had to be there to enjoy the conversation that followed, which is a good reason not to miss these events!

Measuring content value is a process, not a destination

For some, the idea of measuring the value of technical writing requires a shift in thinking. Measuring the value that content provides is just one step in the process of setting and evaluating content goals (see also Design Thinking). Without getting too philosophical, the first realization to make is that

You can’t measure value until you define what is valuable.

Value, however, can take many shapes, and different people in your organization will likely define value differently, especially when it comes to content.
Continue reading “Measuring Value for WriteTheDocs Boulder”

Measuring your technical content – What about…?

If you’ve been following the preceding posts on measuring content, you’ve seen the use cases and customer journey paths become less and less funnel-shaped. This is about the point where whataboutism starts to occur.

In the post on measuring Tutorials, for example, I assert that “the customer’s goal in reading a tutorial is to accomplish something outside of the web,” which makes their success difficult to detect and measure from within the topic. While the definition of a tutorial might make that seem like a pretty clear goal, that doesn’t make it immune to whataboutism.

Whataboutism can enter the discussion at this point in the form of “What about the people who come to the tutorial topic looking for a code sample to copy and paste? They don’t want to learn anything.” Or, “What about the executive who looks at the tutorial to see if it addresses a particular issue they care about?” Or, what about…  You get the idea. From what I’ve seen, it’s easiest for whataboutism to enter the discussion when the goals are broad and vague and the data supporting the goals and their subsequent measurement are scarce.

(Does that sound like a content project to you?)

So, what can you do about the “what about…” cases?
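One option, at least for the copy-and-paste case, is to stop guessing and instrument it. Here’s a minimal sketch, assuming your pages already load Google Analytics’ gtag.js and render code samples in pre blocks; the event name and parameters are made up for illustration:

```typescript
// Sketch: record when a reader copies a code sample, so the "they just want to
// copy and paste" visitor becomes a measurable goal rather than a what-about.
// Assumes gtag.js is loaded; the event name and parameters are hypothetical.
declare function gtag(command: "event", eventName: string, params?: Record<string, unknown>): void;

document.querySelectorAll<HTMLPreElement>("pre").forEach((sample, index) => {
  sample.addEventListener("copy", () => {
    gtag("event", "copy_code_sample", {
      page_path: window.location.pathname,
      sample_index: index, // which sample on the page was copied
    });
  });
});
```

Once that signal exists, the what-about stops being a hypothetical and becomes just another goal with its own numbers behind it.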

Continue reading “Measuring your technical content – What about…?”

Measuring your technical content – Part 3

In my recent posts, Measuring your technical content – Part 1 and Measuring your technical content – Part 2, I described some content goals and how they might be defined and measured for Introduction and Getting Started topics. In Part 1, the interaction was funnel-shaped, while in Part 2, the funnel metaphor applied to only one of the two customer journeys through the page.

In this post, I talk about Tutorial and How-To topics and how the funnel metaphor becomes less appropriate as customers’ goals move beyond the web experience.

It’s time to get creative (scientifically, of course).
Continue reading “Measuring your technical content – Part 3”

Measuring your technical content – Part 2

In my previous post, Measuring your technical content – Part 1, I described some content goals and how those might be defined and measured for an introduction topic. In this post, I look at Getting Started topics.

Getting Started topics

If Introduction topics bring people in by telling them how your product can help them, Getting Started topics are where you show them how that works. Readers who come here from the Introduction topic will want to see some credible evidence that backs up its claims, and Getting Started topics are where you can demonstrate that.

Technical readers will also use this as the entry point into the technology, so there are at least two customer journey paths intersecting here.

  • One path comes to a conclusion here: the reader moves from the Introduction page to see the value and then to the Getting Started topic to see how it works.
  • Another path starts from the Getting Started page (the reader already understands the value proposition of the product) and moves deeper into the technology to apply it to their specific case.

Because at least one of the customer journeys through the Getting Started topics is less funnel-shaped than those through the Introduction topics (some are almost inverted funnels), it’s important to start with the goals and required instrumentation before writing. That way, you can design your page to provide the information that the customer needs for their goals as well as the data you’ll need to evaluate the page (your goal).

So, in that case, how might you measure such a topic’s success?
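To make that concrete, here’s a minimal sketch of what the instrumentation might look like, again assuming gtag.js is on the page. It records when a reader reaches the topic’s “it works” moment and whether they continue deeper; the element IDs, event names, and parameters are placeholders, not anything prescribed by Google Analytics:

```typescript
// Sketch: instrument a Getting Started topic for the two journeys described above.
// Assumes gtag.js is loaded; the "#it-works" and "#next-steps" element IDs and the
// event names are hypothetical placeholders for your own page design.
declare function gtag(command: "event", eventName: string, params?: Record<string, unknown>): void;

// Both journeys: did the reader reach the section that shows the product working?
const successSection = document.querySelector("#it-works");
if (successSection) {
  const observer = new IntersectionObserver((entries) => {
    if (entries.some((entry) => entry.isIntersecting)) {
      gtag("event", "getting_started_success_viewed", { page_path: window.location.pathname });
      observer.disconnect(); // record it once per page view
    }
  });
  observer.observe(successSection);
}

// Second journey only: did the reader continue deeper into the technology?
document.querySelectorAll<HTMLAnchorElement>("#next-steps a").forEach((link) => {
  link.addEventListener("click", () => {
    gtag("event", "getting_started_continued", {
      page_path: window.location.pathname,
      destination: link.href,
    });
  });
});
```

The point isn’t these particular events; it’s that deciding on them before writing shapes the page so it can produce them.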

Continue reading “Measuring your technical content – Part 2”

Measuring your technical content – Part 1

This started out as a single post, but grew, and grew, and… Well, here’s the first installment.

After the last few posts, it would be easy to get the impression that I don’t like Google Analytics.

Not true.

I just don’t like it when it’s treated as the only tool that can collect data about a web site—especially a non-funnel (informational) web site.

In this collection of posts, I’ll look at my favorite topic, API documentation, and how you might analyze it in the context of the customers’ (readers’) journeys. This analysis doesn’t have to apply only to API documentation; because it’s based on customers’ goals, which are more universal, you might see a customer goal that matches some of your own content if you look carefully.

So let’s start with the basic questions…

Continue reading “Measuring your technical content – Part 1”

Google Analytics just makes me sad

In my last post, I talked about how Google Analytics isn’t very helpful in providing meaningful data about technical or help content. It can’t answer questions like: Did it help the reader? Did they find it interesting? Could they accomplish their task or goal thanks to my content? What do readers really want? You know, simple questions like those.

While a little disappointing, that’s not what makes me sad.

What’s sad is that the charts on the dashboard have all the makings of dysfunctional communication. For example, the dashboard seems to tell me, “You’re not retaining readers over time.” But it can’t, or won’t, tell me why.

Awww, come on, gimme a hint?!

Continue reading “Google Analytics just makes me sad”

The answer is Google Analytics—what was the question?

Lately, I’ve seen a collection of blog posts about using Google Analytics for technical or, more generally, informational content that seem to use a formative research method, one that goes something like this: here’s the data you can collect, now let’s imagine the questions it might answer. It’s not that this method isn’t valid, just that it’s usually applied at the beginning of a project and it doesn’t scale particularly well. What many technical communicators could really use is summative data on how their content is doing on a day-to-day basis.

Can Google Analytics provide useful summative metrics? Sure, but very few that monitor what matters to the reader. For informational content, most of them fall into the vanity-metrics category. The reason is that people don’t come to informational sites to read web pages or click links (which is what Google Analytics tracks). They come to accomplish a goal—a goal that is likely not found on your informational web site. Your site, if it is doing its job, is a means to that other goal.

So, how can you measure whether a reader accomplished their goal? And, by measure, I mean observe directly and not infer (or imagine) reader behavior.
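One way to observe it directly is to ask the reader and record the answer at the moment they finish (or give up). Here’s a minimal sketch; the /api/goal-feedback endpoint and the data-goal-answer buttons are hypothetical, something you’d add to your own site rather than anything Google Analytics provides:

```typescript
// Sketch: record the reader's own answer to "did you accomplish what you came
// here to do?" The endpoint and attribute names are hypothetical; you'd need a
// small service of your own to receive and store the responses.
async function recordGoalFeedback(accomplished: boolean): Promise<void> {
  await fetch("/api/goal-feedback", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      page: window.location.pathname,
      accomplished,
      recordedAt: new Date().toISOString(),
    }),
  });
}

// Wire it to a simple yes/no prompt at the end of the topic, for example
// <button data-goal-answer="yes"> and <button data-goal-answer="no">.
document.querySelectorAll<HTMLButtonElement>("[data-goal-answer]").forEach((button) => {
  button.addEventListener("click", () => {
    void recordGoalFeedback(button.dataset.goalAnswer === "yes");
  });
});
```

It’s a blunt instrument and response rates will be modest, but the answers measure the goal itself instead of a proxy for it.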

Continue reading “The answer is Google Analytics—what was the question?”

Measuring the value of technical writing

This topic has been discussed in technical writing circles for decades. It’s not so much a discussion as it is folklore: good advice that’s difficult to apply in practice. It’s been around for a long time—this 1995 article by Ginny Redish, “Adding value as a professional technical communicator,”[1] lists these ways to measure value:

  1. Outcome measures
  2. Ratings of customer satisfaction
  3. Projections (estimates) of value added
  4. General perceptions of the value of technical communicators’ work

I like how she starts with measuring the value added, because if you can’t measure it, then how do you know you have actually delivered it? Further, if you can’t measure it, how can you show improvement—that (a) whatever it is, it’s now better, and (b) it’s better because of a decision you implemented? That’s a trick question. You can’t. So, how can you measure this?
Continue reading “Measuring the value of technical writing”
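For what it’s worth, once you do have a direct measure, such as a goal-completion signal, showing that a change made things better becomes ordinary arithmetic. Here’s a rough sketch in TypeScript that compares the completion rate before and after a content change using a normal-approximation confidence interval; the counts in the usage example are invented for illustration:

```typescript
// Sketch: did a deliberate content change improve the goal-completion rate?
// Compares two observed proportions and reports the difference with an
// approximate 95% confidence interval. The usage numbers below are invented.
interface Observation {
  completions: number; // readers observed accomplishing the goal
  visits: number;      // readers measured in total
}

function compareCompletionRates(before: Observation, after: Observation) {
  const p1 = before.completions / before.visits;
  const p2 = after.completions / after.visits;
  const difference = p2 - p1;
  // Standard error of the difference between two independent proportions.
  const standardError = Math.sqrt(
    (p1 * (1 - p1)) / before.visits + (p2 * (1 - p2)) / after.visits
  );
  const margin = 1.96 * standardError; // ~95% under a normal approximation
  return {
    before: p1,
    after: p2,
    difference,
    confidenceInterval: [difference - margin, difference + margin] as const,
  };
}

// Hypothetical example: 120 of 400 measured readers completed the goal before
// the change, 180 of 420 after it.
const result = compareCompletionRates(
  { completions: 120, visits: 400 },
  { completions: 180, visits: 420 }
);
console.log(result.difference.toFixed(3), result.confidenceInterval);
```

If the interval excludes zero, you have some evidence that the page really did get better, and because you changed one thing on purpose, you can credit that decision (a controlled A/B test makes the attribution stronger still, but the arithmetic is the same).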