Developing software and driving off road

During my presentation at the API the Docs Chicago conference this week, I used an example from my time as a developer to describe a documentation cliff. I characterized it as driving along and encountering an obstacle, such as a muddy patch, that required extra effort to cross. Twenty-some years ago, I used to look for those situations to challenge my driving skills and my 1995 Land Rover Discovery, both of which were well suited to tackling them (however, the documentation of these events is still on paper and film, so I’m going to have to dig for some visual proof).

In the driving metaphor, to get past the obstacle and continue on to the destination, I described employing the various tools on the vehicle, such as the four-wheel drive, “locking in the hubs” (a rather antiquated reference, these days), and pulling out the winch cable, in addition to my driving skills. I was usually prepared (and often, over-prepared) for the challenges encountered in my off-road adventures, and if I wasn’t, the people I traveled with were. We usually arrived intact.

I never realized the depth of this experience as a metaphor for my software development experience until I heard myself describe it in my presentation. While off-roading, not only was determining the destination a challenge (are we going to the right place?), but so was choosing the route, the tools, the pace, and the requisite skills to actually complete the trip. The parallels to my software development experience snapped into focus. The challenge of determining the right thing to do, the right way to do it, and the right tools to get the job done as a software developer became too clear to ignore.

In my presentation, I described this experience as what customers encounter when they use incomplete documentation. For example, when you see detailed on-boarding content (Hello World, tutorials, and such), it’s easy to get a sense that the company cares about your success with the API–after all, look at what they are doing to get you started. But, running into the documentation cliff, or mud bog to stick to the off-road driving metaphor, is what happens when a gap in the documentation abruptly halts your progress.

Continue reading “Developing software and driving off road”

The inverted funnel of API documentation

The presentation slides are available as a PowerPoint download or as a PDF.

I just got back from API the Docs, Chicago, 2019, where I had the pleasure of hearing some dedicated practitioners in the field of API documentation talk to about 100 equally dedicated practitioners about API documentation. While I was there, I gave this presentation. The video of the presentation isn’t available yet, so I have included a link to the slide deck and posted my edited notes below.


It was great to see so many people who shared a passion for making their customers successful through documentation. One of the aspects of technical writing that continues to motivate me is knowing that what I do is going to help a customer achieve and deliver their success.

My presentation was the last of the conference and, after hearing a lot of stories about every level of API documentation, I wanted to wrap things up by returning to the perspective of our customers and their journeys with our products.

Most recently, the customer journey has been more than an academic exercise to me, because, for the past several years, I’ve been developing an information system for limited-resource clinics in Honduras.

It’s been an incredible journey and one that has put me squarely in the customer’s seat with respect to API documentation. I have had to use a lot of software documentation to bring the project together, and this experience has brought into focus a lot of the research I’ve studied and conducted on API documentation over the years.

Throughout my journey with the piClinic Console, my students and I have been studying users and usage to develop and improve the system. In just a few months, we will install the systems for their first field tests in Honduran clinics. By the end of this summer, we’ll know how we did, we’ll know more about our users and our system, and our journey will continue.

That’s my latest journey.

I can say with complete confidence that software documentation played a critical role in helping get my project to and, ultimately, across the finish line. But, each of your customers has their own journey, and everyone at the conference was there to make it a successful one!

Continue reading “The inverted funnel of API documentation”

The documentation cliff

For the past couple of months, I’ve been refactoring the piClinic Console software to get it ready for this summer’s field tests. Along the way, I encountered something I’d seen before, but never really named, until recently.

The documentation cliff.

A documentation cliff is where you get used to a certain level of documentation quality and support as you embark on your customer journey to use a new API and then, somewhere along the way, you realize that level of support has disappeared. And, there you are, like Wile E. Coyote, floating in air, looking back at the cliff and looking down at where you are about to fall in the next instant.

Just kidding. What really happens is that you realize that your earlier plans and schedule have just flown out the window and you need to refactor the remainder of your development plan. At the very least, it means you’re going to have some uncomfortable conversations with stakeholders. In the worst-case scenario, you might need to re-evaluate the product design (and then have some uncomfortable conversations).

Most recently, this happened to me while I was using Postman to build unit tests for the piClinic Console software. I don’t want this to sound like I don’t like Postman–quite the contrary. I love it. But that just makes the fall from the cliff hurt that much more.

How I got to the cliff

In my case, the tool was easy to get started with, the examples and tutorials were great, the online information was helpful–all the things that made a very productive on-boarding experience. So, I on-boarded myself and integrated the product into my testing. In fact, I made it the centerpiece of my testing.

Continue reading “The documentation cliff”

If we could only test docs like we can test code

Postman logo

As I continue to catch up on delinquent and neglected tasks during the inter-semester break, I’ve started porting the software from last year’s piClinic Console to make it ready for prime time. I don’t want to have any prototype code in the software that I’ll be leaving in the clinics this coming summer!

So, module by module, I’m reviewing the code and tightening all the loose screws. To help me along the way, I’m developing automated tests, which is something I haven’t done for quite a while.

The good news about automated tests is they find bugs. The bad news is they find bugs (a lot of bugs, especially as I get things off the ground). The code, however, is noticeably more solid as a result of all this automated testing and I no longer have to wonder if the code can handle this case or that, because I’ll have a test for that!

With testing, I’m getting to know the joy that comes with making a change to the code and not breaking any of the previous tests and the excitement of having the new features work the first time!

I’m also learning to live with the pain of troubleshooting a failed test. At any point in the test and development cycle, a test can fail because:

  1. The test is broken, which happens when I didn’t update a test to accommodate a change in the code.
  2. The code is broken, which happens (occasionally).
  3. The environment is broken. Some tests work only in a specific context, such as with a certain user account or after a previous test has passed.
  4. Cosmic rays. Sometimes tests just fail.

The challenge in troubleshooting these failures is picking the right cause from the start, so that you don’t “fix” something that was actually working while the real problem stays hidden.

But, this is nothing new for developers (or testers). It is, however, completely foreign to a writer.

Here are some of the differences.

Continue reading “If we could only test docs like we can test code”

Recent good, could-be-better, and ugly documentation experiences

During the “break” between semesters, I try to catch up on the tasks I’ve deferred during the past semester and get ahead of the tasks I know will come up during the coming semester. In the process, I’ve had these encounters with documentation that I’ve characterized into these categories:

  • GOOD: documentation doing what it should—making me (the customer) successful.
  • COULD-BE-BETTER: documentation that is well-intentioned, but needs a tweak or two.
  • UGLY: documentation that gives me nightmares.

Here goes.

Good documentation

These are the stories of documentation that made me successful. Being successful makes me happy. Good documentation should help make the reader successful.

Ford Motor Company. F-150 owner’s manual

The windshield wipers on my almost 3-year-old truck are the ones that it came with from the factory—almost 3 years ago. Well past their expiration (and effectiveness) date. But, while I’ve changed car engines and gear boxes before, I hate changing wipers. They always have some clever (and obscure) trick to getting the old ones off and the new ones on. I especially hate it when they go flying off the car in a rain storm. So, for all these reasons, I’ve procrastinated on changing them for far too long (as driving in recent wet weather has reminded me)—the replacements have actually been in my garage since…I actually can’t remember.

Continue reading “Recent good, could-be-better, and ugly documentation experiences”

I just can’t see myself as a customer

How often have you been shopping and, while looking at something that might be suitable, you just had the feeling that it wasn’t right? When shopping for houses last year, we had that feeling many times (too many times). They were, for the most part, all nice homes, but they just didn’t do “it” for us. Sometimes we could articulate the reasons, other times, it was something less tangible. In every case, except for the last one, of course, we walked away without becoming a customer.

We finished house hunting about two years ago and I hadn’t given that feeling a second thought until I came across a blog post about making the Build vs. Buy Decision. Point 4 of that post, whether it is cheaper to build than to buy, got me thinking about one of the value propositions that technical communication frequently asserts–to increase product adoption. To tie it back to house hunting, let’s call it making it easy for potential customers to see themselves using your product and becoming a customer instead of just a prospect.

Try it, you’ll like it!

In API marketing, one of the methods recommended to promote your API to potential customers is to provide some easy way for them to take a test drive. Options include providing a sandbox, offering an accessible Hello World application or example, and publishing tutorials that solve common customer pain points. The idea is to get potential customers into the driver’s seat and let them take it for a spin. The lower the barrier to entry, the easier it is for them to see if this is a product for them and to see themselves as satisfied customers.

Continue reading “I just can’t see myself as a customer”

What’s your story?

For a few years, I tried my hand at filmmaking. I was OK at it; however, it was during a time when the competition was amazing. OK wasn’t good enough. Fortunately, I had a backup job and, after a couple of years of being OK at filmmaking, I returned to technical writing. I was surprised to find that after being a filmmaker for a while, my technical writing had, somehow, improved, even though I did absolutely zero technical writing as a filmmaker.

Here are some of the lessons I learned as a filmmaker that apparently had more value when applied to technical writing.

All that matters is what people see on the screen

As an independent filmmaker I watched many independent films. Those scars will be with me for the rest of my life (if you’ve watched indie films, you know what I mean; if you haven’t, you’ve been warned). Some of my films must have scarred others (sorry). I believe filmmaker Robert Rodriguez said in Rebel Without a Crew that you need to make several hundred films before you become a filmmaker. I would agree. I still had a few hundred more to go before I ran out of money.

Before watching others’ films, the directors (or, more often, the director/producer/writer/photographer/star) would give a brief commentary and, invariably, these commentaries included all the things that didn’t go as hoped or as planned—preparing us for what we were about to experience (i.e., lowering our expectations). While of some interest to the filmmakers in the audience, this information was irrelevant to anyone else.

What matters to the audience (i.e. those who might actually pay to see/rent/download the film) is what is on the screen. Period.

Tech writing takeaway: the reader will judge the content for what they see and not what you would like them to see.

Continue reading “What’s your story?”

What were they thinking?!

Design reviews and critiques are a favorite pastime. While my righteous indignation at having to suffer a bad website can still evoke some pointed criticism, I’m trying to keep things in perspective.

I’ve talked about critiquing designs in the past, but here, I’m talking about those informal critiques of designs created or constructed by other people or organizations. You know, the ones that usually start with “WTF were they thinking!?!”

The objects of such informal critiques are legend. A search for design fails in Google returns 270,000,000 results to me (in 0.3 seconds, so it must be a popular search). I remember talking about some of these design fails in design classes, but I’ve been involved with designing, building, and shipping long enough to know these examples of design fails are team efforts.

Full disclosure, I still shout “WTF?!” from time to time in spite of the following introspection, but my hope is that reflecting on these points will help me view them with a little more perspective. I write this so that you might, as well.


Let’s start with that point when you’re using a website that is sporting the latest version of Gordian-knot navigation or TV remote control and you’re wondering aloud how anyone could design such a mess, let alone ship it for what seems like the sole purpose of causing you grief. Of course, YOU would NEVER commit such a crime against humanity!

Would you?

Have you? (Be honest.)

If you’ve shipped more than one or two documents, products, websites, or you name it, then in all likelihood you, too, have published or shipped something that someone, somewhere, at some time will greet with a “WTF?!” when they use it. It’s almost impossible not to, and here’s why.

Why do people ship bad designs?

Continue reading “What were they thinking?!”

Who writes docs on API docs?

Pie chart of authors of API documentation literature showing 160 male, 32 female, and 11 unknown authors
The sex of the authors in my API documentation bibliography.

As More docs on API docs described, it’s mostly academic authors writing about API docs, but only by a narrow margin. As I was reviewing the articles I found, my impression was that only men were writing about API documentation.

Because impressions can be mistaken, I went through the articles and reviewed all the authors to determine their gender, insofar as I could. Because this information is rarely reported with the articles, I had to do some digging.

Most of my determinations of an author’s gender are based on the author’s given name; however, where available, I used their portrait or other references to confirm. For the names that had a recognizable (to me) gender preference, I assigned them accordingly. For those I couldn’t easily tell, I searched for the author’s profile on Google Scholar, ResearchGate, and just plain ol’ Google to find a social media page. However, after all that, I could not make a credible determination for a little over 5% of the authors, so they show up as unknown in the pie chart.

It turns out that my bibliography of 112 articles has a total of 203 unique authors, at least 160 (79%) of whom appear to be men, while at least 32 (16%) appear to be women. Or, seen another way,

1 author in 6 who has written an article on API documentation is a woman.

That ratio is roughly consistent with the population of software developers reported in surveys such as Women in Software Engineering: The Sobering Stats (1/6) and 2018 Women in Tech Report (1/7).
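For the curious, the percentages fall straight out of the counts; here’s the arithmetic as a quick Python check (the counts are the ones reported above):

```python
# Quick check of the author tallies from the bibliography.
authors = {"men": 160, "women": 32, "unknown": 11}

total = sum(authors.values())
shares = {group: round(100 * count / total) for group, count in authors.items()}

print(total)   # 203 unique authors
print(shares)  # {'men': 79, 'women': 16, 'unknown': 5}
print(round(total / authors["women"]))  # roughly 1 woman author in 6
```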

So, what does this say?

I’m not sure. It seems like a case of: hey, 1 in 6 authors of articles about API documentation is a woman, which is the same proportion as in the software development field. And, hey, only 1 in 6 authors is a woman, which is the same proportion as in the software development field. Also noteworthy: the earliest article in my list was authored by a woman.

More docs on API docs

I’ve tabulated my entire collection of docs on API docs for your perusing pleasure in my API documentation bibliography. The documents for which I could find links to online copies have links. The rest have restricted access. Fortunately (and somewhat surprisingly) only 14 (12.5%) of the 112 articles I’ve found so far are restricted.

The articles are sorted by year published and title so the most recent publications will appear at the top. I’ll check in every once in a while to see what’s new in the field and update the list as necessary.

I’ve reviewed each of these to varying degrees of detail. Some are, admittedly, very detailed and read like a functional software specification (i.e. as dry as the Atacama), so in those, I read just enough to pull out the classification details.

I didn’t plan this (nor was I expecting it), but the author-affiliations of all these articles break down as:

  • 47.3% of the articles were written by academic authors
  • 42.0% of the articles were written by industry authors
  • 8.9% of the articles were written by a mix (collaboration?) of industry and academic authors

Continue reading “More docs on API docs”