Measuring your technical content – Part 3

In my recent posts, Measuring your technical content – Part 1 and Measuring your technical content – Part 2, I described some content goals and how those might be defined and measured for Introduction and Getting Started topics. In Part 1, the interaction was funnel shaped, while in Part 2, the funnel metaphor applied only in one of the two customer journeys through the page.

In this post, I talk about Tutorials and How-To topics and how the funnel metaphor becomes less appropriate as customers’ goals move beyond the web experience.

It’s time to get creative (scientifically, of course).
Continue reading “Measuring your technical content – Part 3”

Measuring your technical content – Part 2

In my previous post, Measuring your technical content – Part 1, I described some content goals and how those might be defined and measured for an introduction topic. In this post, I look at Getting Started topics.

Getting Started topics

If Introduction topics bring people in by telling them how your product can help them, Getting Started topics are where you show them how that works. Readers who come here from the Introduction topic will want to see credible evidence that backs up the claims made there, and these topics are where you can demonstrate that.

Technical readers will also use this as the entry point into the technology, so there are at least two customer journey paths intersecting here.

  • One path comes to a conclusion here, moving from the Introduction page to see the value and then to the Getting Started topic to see how it works.
  • Another path starts from the Getting Started page (already understanding the value proposition of the product) and moves deeper into the technology to apply it to the reader’s specific case.

Because at least one of the customer journeys through the Getting Started topics is less funnel-shaped than the journeys through the Introduction topics (some are almost inverted funnels), it’s important to start with the goals and the required instrumentation before writing. That way, you can design your page to provide the information that the customer needs for their goals as well as the data you’ll need to evaluate the page (your goal).
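
To make “instrumentation” concrete, here’s a minimal sketch of what event tracking for the two journeys might look like. It assumes Google Analytics’ standard gtag("event", …) call; the event names and element selectors are hypothetical, invented for illustration:

    // Hypothetical instrumentation for a Getting Started page.
    // The event names and element selectors are illustrative
    // assumptions; gtag("event", ...) is the standard Google
    // Analytics event call.
    declare function gtag(command: "event", eventName: string,
                          params?: Record<string, string>): void;

    // Journey 1 signal: the reader copies the sample code,
    // evidence that they came to see how the product works.
    document.querySelector("#copy-sample")?.addEventListener("click", () => {
      gtag("event", "getting_started_copy_sample", { page: location.pathname });
    });

    // Journey 2 signal: the reader continues deeper into the docs,
    // evidence that they're applying the product to their own case.
    document.querySelectorAll("a.deeper-topic").forEach((link) => {
      link.addEventListener("click", () => {
        gtag("event", "getting_started_continue", {
          destination: (link as HTMLAnchorElement).href,
        });
      });
    });

Which of the two events fires tells you which journey a visit followed; if neither fires for most visits, the page may not be serving either goal.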

So, in that case, how might you measure such a topic’s success?

Continue reading “Measuring your technical content – Part 2”

Measuring your technical content – Part 1

This started out as a single post, but grew, and grew, and… Well, here’s the first installment.

After the last few posts, it would be easy to get the impression that I don’t like Google Analytics.

Not true.

I just don’t like when it’s treated like it’s the only tool that can collect data about a web site—especially a non-funnel (informational) web site.

In this collection of posts, I’ll look at my favorite topic, API documentation, and how you might analyze it in the context of the customers’ (readers’) journeys. This analysis doesn’t have to apply only to API documentation, because it’s based on customers’ goals, which are more universal. If you look carefully, you might see a customer goal that matches some of your content.

So let’s start with the basic questions…

Continue reading “Measuring your technical content – Part 1”

Google Analytics just makes me sad

In my last post, I talked about how Google Analytics isn’t very helpful in providing meaningful data about technical or help content. It can’t answer questions like: Did it help the reader? Did they find it interesting? Could they accomplish their task/goal thanks to my content? What do readers really want? You know, simple questions like those.

While a little disappointing, that’s not what makes me sad.

What’s sad is that the charts on the dashboard have all the makings of dysfunctional communication. For example, the dashboard seems to tell me, “You’re not retaining readers over time.” But it can’t, or won’t, tell me why.

Awww, come on, gimme a hint?!

Continue reading “Google Analytics just makes me sad”

The answer is Google Analytics—what was the question?

Lately, I’ve seen a collection of blog posts about using Google Analytics for technical or, more generally, informational content that seem to use a formative research method that goes something like this: here’s the data you can collect, now let’s imagine the questions it might answer. It’s not that this method isn’t valid, just that it’s usually done at the beginning of a project and it doesn’t scale particularly well. What many technical communicators could use is summative data on how their content is doing on a day-to-day basis.

Can Google Analytics provide useful summative metrics? Sure, but very few that monitor what matters to the reader. For informational content, most of them fall into the vanity-metrics category. The reason is that people don’t come to informational sites to read web pages or click links (which is what Google Analytics tracks). They come to accomplish a goal, and that goal is likely not found on your informational web site. Your site, if it is doing its job, is a means to that goal.

So, how can you measure whether a reader accomplished their goal? And, by measure, I mean observe directly and not infer (or imagine) reader behavior.
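
As one possible answer, here’s a minimal sketch of direct observation: ask the reader whether they accomplished their task, and record the answer as an analytics event. The widget’s button IDs and the event name are my own hypothetical choices; gtag("event", …) is the standard Google Analytics event call:

    // Sketch of a "Did this page help you accomplish your task?"
    // widget that records the reader's answer directly, instead of
    // inferring it from page views. The element IDs and event name
    // are hypothetical.
    declare function gtag(command: "event", eventName: string,
                          params?: Record<string, string>): void;

    function recordGoalOutcome(accomplished: boolean): void {
      gtag("event", "task_goal_outcome", {
        page: location.pathname,
        outcome: accomplished ? "accomplished" : "not_accomplished",
      });
    }

    // Wire the widget's Yes/No buttons to the event.
    document.querySelector("#goal-yes")?.addEventListener("click", () =>
      recordGoalOutcome(true));
    document.querySelector("#goal-no")?.addEventListener("click", () =>
      recordGoalOutcome(false));

It’s still self-reported rather than perfectly objective, but it captures the reader’s own verdict on their goal instead of imagining it from click data.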

Continue reading “The answer is Google Analytics—what was the question?”

Still buzzing about StackOverflow documentation

I don’t think I’ve ever seen a technical documentation project get so much attention (be it successful or not). The good news is that, at least, people are talking.

But the talk still looks only at symptoms. The conversations have been interesting, but they fall short of the root causes.

The Documentation project’s kick-off post identified these problems with documentation (paraphrased here) that the project intended to fix:

  • Documentation is often an afterthought
  • Documentation is lacking in examples
  • Documentation is rarely complete

The kick-off described the problem, fundamentally, as a shortage of documentation and approached the solution by making it easier to add more documentation (articles and examples). Consider each of these points as discussed by some other technical writers. Continue reading “Still buzzing about StackOverflow documentation”

Measuring the value of technical writing

This topic has been discussed in technical writing circles for decades. It’s not so much a discussion as it is folklore in the form of good advice that’s difficult to apply in practice, and it’s been around for a long time. This 1995 article by Ginny Redish, “Adding value as a professional technical communicator,”[1] lists these ways to measure value:

  1. Outcome measures
  2. Ratings of customer satisfaction
  3. Projections (estimates) of value added
  4. General perceptions of the value of technical communicators’ work

I like how she starts with measuring the value added, because if you can’t measure it, then how do you know you have actually delivered it? Further, if you can’t measure it, how can you show improvement: (a) that whatever it is, it’s now better, and (b) that it’s better because of a decision you implemented? That’s a trick question. You can’t. So, how can you measure this? Continue reading “Measuring the value of technical writing”

Learning from v1

Yesterday, I posted some thoughts on the announcement of closing down the StackOverflow Documentation Beta. I tweeted the first question I had after seeing this announcement:

And got a very gracious reply from Jon Ericson, the community manager who posted the announcement. I was impressed and, after reading some of Jon’s other posts related to this announcement, developed the utmost respect and admiration for his candid and open discussion of the project. That takes integrity and dedication at a level to which I can only hope to aspire. I’m feeling a little jealous, but in a motivational way.

Thanks to the transparency of the posts about the project, there is a lot for developers and technical writers to learn here. Continue reading “Learning from v1”

It’s hard to write good technical docs

StackOverflow’s announcement ending their documentation beta was disappointing and yet not surprising. After a valiant attempt to tap into the wisdom of the crowd, StackOverflow discovered that good documentation is hard to write. As someone who’s read, written, and researched software documentation for a few decades (and with a dose of 20/20 hindsight), I found it easy to see this coming in the assumptions they made or implied on their tour page, quoted here:

When we reviewed traditional documentation, two things were clear:

  • It had to be based on assumptions. It was usually written once, often by someone not even using the technology, so it was a guess at what to focus on.
  • It didn’t prioritize good examples. People learn best when they can see things demonstrated in actual code.

Let me break this down… (or you can just jump to my suggestions) Continue reading “It’s hard to write good technical docs”

Still catching up…

[Photo: Archived paper patient files from one of the clinics I visited this summer.]

It seems as though I’ve been “catching up” since I started as an assistant professor, and I’m not sure I’ve gotten any better, but I keep trying.

Summer break is just weeks from turning into the fall semester, when I’ll dive into my second year as an assistant professor. While I’m looking forward to it, I really need two more months of summer to catch up.

As I get used to academia, I’m finding the term “summer break” to be a bit misleading. While it’s a break in the academic year, I’ve been too busy to consider it much of a break. Granted, I’ve been having fun, and it’s nice that work is fun.

Summer started off with a trip to Honduras, where I was able to conduct some site visits and contextual inquiries into how limited-resource health clinics manage their records. After two weeks of visiting five different clinics in Honduras, I came back with enough research to complete a conference paper, start a journal article, apply for a grant, and get an internship for one of our TC students…and that was all before the spring semester had officially ended.

After returning, of course, I had to actually produce the papers for which I’d collected the data and prepare for the presentation that I’ll be making next week with two professors from my department at the IEEE ProComm 2017 conference in Madison, WI.

One happy coincidence this summer was meeting some super technical writers from the Denver-Boulder (Colorado) area at a Write-the-Docs meetup. As luck would have it, their meeting and my travel coincided last night in Broomfield, CO, and I got to meet some of the tech writers whom, until last night, I had known only through email and the WTD forum.

But, busy is good. I’m looking forward to next week’s conference and, of course, the semester that starts just a few weeks after that.