Agile documentation in practice

The last couple of posts seem to state the obvious. They are, basically, variations of "know your reader/customer." So, how hard could it be?

In my experience, there’s never time to do all of what is suggested. As a professional writer, I’ve always had more on the “topics-to-write” list than time available to write them, often by an order of magnitude.

Exactly the situation Agile was designed for.

I would think that writers would flock to Agile if it could solve the problem of “so many docs, yet so little time.” And they might, if given the chance.

Yet, an unscientific survey of documentation planning methodologies suggests they are all based on a plan-driven (a.k.a. waterfall) project management model:

Plan, Write, Revise, Edit, Publish

…as though publishing were something final. In Agile documentation, that might describe the process for publishing a single topic, but it doesn’t scale up. If we call this sequence write-a-topic, then at the larger scale the documentation process looks more like:

Plan, Write-a-topic, Review, Repeat

Stopping every so often to step back and look at the view (i.e. take care of more strategic planning).

What works for me

I try to organize my documentation work as much like the software development process as possible, but it took some deprogramming to break free of the waterfall method into which I’d been indoctrinated.

It works best when I can define my documentation projects as epics, sprints, and stories.

An epic is a large-scale project or piece of functionality that spans multiple sprints and consists of many stories. For example, produce the documentation for a new feature. While the developers design the feature, I’m working with them to design the documentation—keeping the Agile methodology in mind. I won’t be able to write it all at once, so I work with the product owner and the development team to figure out what functionality will be available when (in sequence, if not by date), and what topics the customer will need and in what order (they can’t read it all at once any more than I can write it all at once).

With that information, I define the stories and prioritize them with the product owner such that if I have to stop at any point along the way (e.g. to start on the next project), I’ll know that the most valuable topics have been written and the rest can wait (or they really weren’t as valuable or important as I thought they were).

The sprint planning and reviews keep the priorities in focus, but that often means that the epics are not done linearly. One documentation epic was about 2 months of work, but took 4-5 months to complete because of intervening priorities. Fortunately, I designed the content such that it was always publishable (shippable, in Agile terms) and the overall content set just kept getting better as I wrote it. Eventually, it was finished—as was all the intervening, higher-value content.

As Hannibal of the A-Team would say, “I love it when a plan comes together.”

Agile documentation customers

I continue the reflection I started in my last post, reviewing Scott Ambler’s article from 2002, Can Documentation Be Agile?

To no one’s surprise, the customer emerged as a critical element in Agile documentation. The points in Scott’s article refer to the Customer as though the customer were singular; “users,” “the user,” and so on are used similarly. In most cases, however (i.e., with any luck), there is more than one customer. So, how can we understand all those people?

Find the center of mass

In physics, large bodies are modeled around a center of gravity or center of mass. This notion facilitates the study of very large objects, such as planet- or galaxy-sized objects in astrophysics, by modeling them as a single point with the mass of the entire object.
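To make the idea concrete, the textbook definition says the center of mass of a collection of point masses is simply their mass-weighted average position:

$$\mathbf{R} = \frac{1}{M}\sum_{i} m_i \mathbf{r}_i, \qquad M = \sum_{i} m_i$$

A sprawling collection of individual points can then be treated as one representative point for many purposes.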

As with planets and galaxies, customers can be modeled in a similar fashion. Many customers can be (and frequently are) modeled as a few personas or market segments. This is common in fields that deal with large numbers of people, such as marketing and advertising. In these cases, populations are described by what they have in common and treated as a single persona. Does that mean every person in a segment’s population is the same? Of course not. But a well-designed persona or segment describes the common attributes of the group sufficiently for its intended use, just as knowing the location of an object’s center of mass lets you understand its properties and behavior when acted upon by external forces.

One drawback to this method is that when you miscalculate an object’s center of mass, it won’t behave as you expect. Its behavior will be consistent with its actual center of mass (whether you know where that is or not). Where I’ve seen personas and segments go awry in technical writing is, inevitably, when they were poorly defined or not applied at all.

Base personas on research

Just as an object’s center of mass is found through measurement, a population’s or segment’s properties are determined through measurement and analysis (a.k.a. research).

If you want to treat a population as a single entity, you must measure it precisely.

But measuring customers is not easy. So what’s a writer to do?

Iterate

Successive approximation, a method for improving and refining a measurement, is an important part of Agile. Measure, test, observe, refine, repeat (or pick your favorite variation). With this approach, you can continuously improve your customer model to the point where you can treat the customer as a single entity. There are several ways to accomplish successive approximation, but the key is to assume from the start that you must evaluate and iterate, and to bake those steps into every process.
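As a deliberately toy sketch of that loop (every function and field name here is a hypothetical placeholder for whatever research, analytics, or feedback channels you actually have, not a prescribed tool or process), the shape of it might look like this:

```python
# A minimal sketch of the measure -> evaluate -> refine loop for a customer model.
# All names are hypothetical placeholders.

def measure(model):
    """Stand-in for real measurement: surveys, analytics, support tickets, interviews."""
    return {"primary_goal": "integrate quickly", "preferred_channel": "search"}

def evaluate(model, observations):
    """Return the attributes where the current model disagrees with what was observed."""
    return {k: v for k, v in observations.items() if model.get(k) != v}

def refine(model, gaps):
    """Fold the corrections back into the persona or segment definition."""
    return {**model, **gaps}

persona = {"primary_goal": "learn the product", "preferred_channel": "docs portal"}
for _ in range(5):           # each pass is one successive approximation
    gaps = evaluate(persona, measure(persona))
    if not gaps:             # the model matches what was observed; good enough for now
        break
    persona = refine(persona, gaps)
print(persona)
```

The point isn’t the code; it’s that evaluation and iteration are assumed and built in, rather than bolted on at the end.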

So, one of the first steps toward agile documentation is the ability to iterate (cheaply and quickly) and the ability to measure and understand your customers.

But, we all know that. Now, to figure out how to make that part of the process.

Agile documentation practices

Scott Ambler started an interesting conversation about agile documentation on LinkedIn a short while ago. I visited Scott’s site to see what else he had to say on the subject, and I came across his article from 2002, Can Documentation Be Agile?

In his article, he lists 12 points that can help documentation be agile. Scott’s article outlines a view of writing that I have not seen in practice in the 14 years since he wrote it. In my careers as a developer and a writer, I’ve seen bits and pieces of his list in practice, but never enough of them at once to see how they might work together. I’ve also seen many obstacles that keep them from becoming more widespread. So, in that context, I reflected on his points to imagine what such a world might look like.

I think it starts with us, the Writers, who need to look at the job from a new perspective.

Writers are more than “the scribes”

First, writers are not just “wordsmiths.” While we take pride in our ability to craft great content, that’s not where we add value. Our value comes from what we craft, not simply that we craft. Likewise, software developers don’t add value just because they code; they add value because of what they code.

So, how do we know what to craft?

Focus on the customer

His first point is where writing really starts and ends—the writer’s audience and customer. Writers need to know their audience firsthand. They need to walk in their customers’ shoes. More importantly, they need to make (or be granted) the time and resources to do this. Every writer? Yes, I think so, at least to some degree. Why? The longer a writer goes without walking in the customer’s shoes, the more assumptions creep in and take hold. It starts off slowly, but eventually the writer’s view of the customer gets out of sync with the real customer. Markets change. Customers change. Products change. It’s a full-time job keeping them all in sync, but it’s a necessary one.

Keep it simple, but not too simple

Of course, where that fine line falls is the source of constant debate. But, why is that such a debatable topic? I refer back to the previous point. If you know your customers, you know what they want. If you don’t, you make assumptions, and so the discussion becomes one of assumptions and not data. Hence, the contention.

Another aspect of this that challenges the writer is when one topic needs a lot of content to be “simple enough,” while another requires very little. The former looks lavish and the latter seems incomplete—until you use them, of course. But I’ve seen this unevenness become the topic of contentious discussions. Why? Because audience data to the contrary was lacking.

All the more reason for writers to get out and meet their customers.

Finding open-source projects that need documentation

I’ve suggested in various venues that aspiring (and experienced) tech writers look into open-source projects to find projects they can use to build out their portfolio. In addition to making your portfolio a better place, working on civic-tech open-source projects has the extra advantage of helping to make the world a better place.

The follow-on question to this advice is, invariably, “how do I find projects that need documentation?”

Well, from your lips to Twitter’s ears. I got this link to Code for America as a Christmas present in yesterday’s Twitter feed.

They describe how to query for civic-tech open-source projects that are looking for help. You can modify the search to specify keywords, such as documentation. Try it and see if you can find a project you could help.

Search for civic-tech open-source projects that need documentation

This is a live link, so the content will change, but as I write this, searching for documentation returned over 300 projects.
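If you’d rather roll your own query than rely on that link, here’s a minimal sketch of the same idea against GitHub’s public search API. The query string below is my own generic example (open “help wanted” issues that mention documentation), not the Code for America query itself, so adjust the keywords and labels to suit:

```python
import requests

# Search GitHub issues for open "help wanted" issues that mention documentation.
# This is an illustrative query; narrow it with project- or topic-specific terms.
query = 'documentation label:"help wanted" is:open is:issue'
resp = requests.get(
    "https://api.github.com/search/issues",
    params={"q": query, "per_page": 20},
    headers={"Accept": "application/vnd.github+json"},
)
resp.raise_for_status()
for item in resp.json()["items"]:
    print(f'{item["title"]} - {item["html_url"]}')
```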

Maybe there’s some potential for a New Year’s resolution?

More Audience, Market, Product

In a recent post, I introduced the Audience, Market, and Product framework. Within this framework, it is possible to understand the characteristics of each component that influence the documentation plan. During this investigation, you might also learn things that can help guide product development and marketing (in fact, I would be surprised if you didn’t), but I’m going to constrain the scope of this analysis to documentation only.

The previous post described the components of the framework; here, I look at what you need to know about each one to design your documentation plan. In future posts, I’ll describe some of the ways to learn what you need to know about each component.

When reviewing the following points, keep in mind that these aspects can vary over time. For example, the documentation requirements will change as the audience becomes more familiar with the product or the product becomes more (or less) successful in the market. So, it’s important to consider these components as dynamic and not static.

Audience

In looking at the audience, you want to have a clear picture of who the readers are. How will they consume and apply information about the product? This is important to know in order to design the content in a way that they’ll find most useful and effective.

Market

In reviewing the market in which the product is offered, you want to understand the position of both the product and the company in the market. This is essential to develop the rhetorical approach of the documentation that will be most effective for the customer and for the company. For example, does the documentation need to help sell the product or would that tone be inappropriate and counterproductive?

Product

Finally, there’s the product and its aspects that influence documentation. Some of these aspects can be large and obvious, like key features and functionality, while others might appear to be subtle, yet still have considerable impact on the documentation. As with the other components, the ultimate goal of knowing this is to develop a clear picture of what the customer needs to be successful with the product and what the company needs to be successful.

What’s next

In the posts that follow, I’ll present some of the questions to ask in order to find out what you need to know to design your documentation plan.

Videos in technical communication

The subject of videos frequently comes up in conversations about technical communication, even when talking about API documentation. On the one hand, they can add some zing to your technical content (and what technical content can’t use a boost in the zing department?). On the other hand, they can produce a negligible, or even negative, return on the cost of producing them.

Video genres

To make this discussion more concrete, I consider these genres:

  • How-to videos
  • Can-do videos
  • Meet-me videos

How-to videos

These videos describe, and usually demonstrate, a task. Ideally, they describe the end-state (goal), beginning state (you and your problem), and the steps required to take the viewer from beginning to end.

The video on how to repair eyeglasses is still my favorite example of this genre.

Depending on the nature of the task, the viewers’ goals are often reading (viewing) to do a task now or reading (viewing) to do a task later.

Can-do videos

Many products, software and APIs included, have more features than meet the eye, or more applications than the viewer might realize. The line between promotional and educational can-do videos is fuzzy. While technically user education, they can seem promotional because they are promoting a capability of the product. The difference is in the tone and the call-to-action.

Until I produced a video about a Microsoft API, I didn’t think it was possible to make a compelling video about an API, especially one with no user interface. It turns out that it is possible. It just takes imagination (and a lot of effort).

The viewers’ goal for these videos is, invariably, reading (viewing) to do a task later, if only because they didn’t know they could do it before (or during) the video.

Meet-me videos

These are the videos in which a member of the product or development team provides a behind-the-scenes look at the product, its development, or its manufacture. When they provide information about the internal architecture and implementation that the viewer can apply, they can be interesting and educational.

There are many bad examples of this genre, which makes the standout ones really stand out. The TED talk by Tony Fadell is a great meet-me video because, in it, we learn more about him in a way that is very accessible (as in, “you could do what I do, too”).

The viewers’ goal for these videos is often reading (viewing) to learn, because, unlike how-to and can-do videos, it’s hard for the viewer to know, in advance, what the video will deliver in terms of knowledge or entertainment.

Which one to choose?

As always, it depends. I think the Audience, Market, and Product framework applies to videos as it does to any other type of content. If you understand what resonates with the audience and how to present that in the prevailing market, the best type of video for the situation should be clear. Understanding the readers’ goals, of course, helps bring the answer into focus.

Audience, Market, Product

In a podcast interview I did with Tom Johnson, I mentioned this framework as a way to evaluate technical documentation requirements. The components of audience, market, and product aren’t anything new, nor is considering them in documentation planning. What’s been missing, however, is an effective way to understand them in a way that informs the documentation.

This framework is my latest iteration on how to apply the 12 cognitive dimensions of API usability to technical documentation. These dimensions, by themselves, are very difficult to apply for various reasons, but I think the notion of identifying the components and elements of an interaction can be useful—provided the method is usable. So, I’ve taken a step back from the level of detail of the 12 dimensions to these three components.

In this framework, to correctly assess the requirements, it’s essential to consider not just the documentation but the entire customer experience in which the documentation will reside. I’m still thinking out loud here, because I think there’s some value in lingering on the question(s) before diving into the solution process.

So to review the framework’s components…

Audience

These are the people who will (or who you expect will) read the content. Content includes anything you write for someone else to read. The boundaries of this depend on a lot of local variables, but should include all the content of the entire customer experience. Your audience might be segmented into groups such as business/purchase decision makers, direct users, indirect users, support, and development. You should know how they all interact with the entire customer experience.

Market

For this analysis, the market is the space in which your company or product is acquired. It could be an open-source product that offers a service or benefit similar to others. It could be downloaded from an app store. It could be something sold door-to-door. How your product appears in the space it shares with other similar products influences your content priorities. The more you know about the relationship between your product, its competitors, and its customers, the better you can assess those influences.

Product

Finally, there’s the product itself. How does it work? What does it do? How is it designed? What are its key features and benefits to the customer? What are its challenges? Knowing how the product’s features interact with the customer (i.e. audience) has a significant influence on the documentation.

And so…

And so, that’s where it begins. I’m still formulating the questions, and I think the questions are the key to bringing this down from a theoretical notion to something that can be applied by practitioners.

It all starts with knowledge (as opposed to assumption and conjecture), and that usually comes from research.

Next, I’ll look at the questions that are specific to each component.

Best practice…for you?

Last week, I saw a post on LinkedIn about a “new” finding (from 2012) that “New research shows correlation between difficult to read fonts and content recall.” First, kudos for not confusing correlation and causation (although the study was experimental and did demonstrate a causal relationship), but the source article is an example of inappropriate generalization. To the point of this post, it also underscores the context-sensitive nature of content and how such advice and best practices should be tested in each specific context.

Hard to read, easy to recall?

The LinkedIn post refers to an article in the March 2012 issue of the Harvard Business Review. The HBR article starts out overgeneralizing by summarizing the finding of a small experiment as, “People recall what they’ve read better when it’s printed in smaller, less legible type.” This research was also picked up by Malcolm Gladwell’s David and Goliath, which has the effect of making it almost as true as the law of gravity.

Towards the end of the HBR article, the researcher tries to rein in the overgeneralizations by saying (emphasis mine), “Much of our research was done at a high-performing high school…It’s not clear how generalizable our findings are to low-performing schools or unmotivated students.” …or perhaps to people who are not even students? Again, kudos for trying. Further complicating the finding stated in the HBR article is that the study’s findings have not been reliably replicated in subsequent studies, other populations, or larger groups. I’m not discounting the researcher’s efforts; in fact, I agree with his observation that the conclusions don’t seem to be generalizable beyond the experiment’s scope.

Context is a high-order bit

All this reinforces the notion that when studying content and communication, context is a high-order bit1. As a high-order bit, ignoring it can have profound implications on the results. Any “best practice” or otherwise generalized advice should not be considered without including its contexts: the context in which it was derived and the context into which it will be applied.

This also reinforces the need to design content for testing, and then to test and analyze it.



1. In a binary number, the high-order bit influences the value more than all of the lower-order bits put together.
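For example, in an eight-bit number, the high-order bit alone is worth more than every other bit combined:

$$2^7 = 128 > 2^6 + 2^5 + \cdots + 2^1 + 2^0 = 127$$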

Studies show…

In the quest to do more with less, one method I’ve seen used to get the job done more quickly is to rely on best practices and studies. It’s not that referring to best practices or studies is bad, but they should be the starting point for decisions, not the final word. Why?

Your context is unique

This was made obvious in my dissertation study, in which the effect of applying (or not applying) best practices depended on what was being measured (i.e., what mattered). In the API reference topics I studied, whether or not topics used the headings and design elements suggested by the prevailing best practices made no difference in reading performance, but it made a significant difference in how readers perceived the topics.

Those results applied to the context of my experiment and they might apply to other, similar contexts, but you’d have to test them to know for sure. Does it matter? You tell me. That, too, depends on the context.

A study showed…

A report on the effect that design variations had on a news site’s home page came out recently, showing that a modern interface produced better engagement than a more traditional image+text interface. However, readers of the latter interface had better comprehension of the articles presented.

Since it relates design, comprehension, and engagement, I thought it was quite interesting. I skimmed the actual study, which seemed reasonable. I’m preparing myself, however, for what the provocative headline of the blog article is likely to produce: its use in the inevitable “studies show…” argument. It has all the components of great “studies show” ammo: it refers to modern design (timely), it has mixed results (so you can quote the result that suits your argument), and it has a catchy headline (so you don’t need to read the article or the report).

Remember, your context is unique

Starting from other studies and “best practices” is great. But, because your context is, invariably, unique, you’ll still need to test and validate.

When it comes to best practices and applying other studies, trust but verify.