Tech comm education and the job market

One of the many informative experiences I had at this year’s WriteTheDocs 2018 conference was my chat with employers and recruiters at the Job Fair. My primary goal was to learn what they looked for in their technical communication interns and entry-level recruits, so I could bring that back to school. Having only recently left industry for academia, I wasn’t surprised by what I learned, but it was still a wake-up call.

Recruiters want people (even their interns) who have experience, and they want current and relevant experience.

I agree, that’s a bit of a “dog bites man” story. Of course they do. All other things being equal, having current and relevant experience is almost always better than not. I suspect the assumption is that, with it, the person will become a productive team member sooner rather than later. What I think is often overlooked is how other idiosyncratic aspects of the specific job might have a greater influence on time to productivity, such as onboarding support (discussed in Sarah Day’s talk at this year’s conference), but that’s not something that I, as an educator, have much control over (unless you’re hiring me to consult on the process, of course).

What an intern needs

One of the things I have to keep in mind when reviewing my notes is that I was at a software documentation conference, and the companies I talked to were software companies with open-source software products to document. In that context, it was natural that they were looking for writers with experience in their product domains: open-source tool experience (Git, GitHub, Markdown, etc.). They also looked for a passion for documentation, as evidenced by experience with such tools (e.g., documenting an open-source project on GitHub). I don’t know how many of these observations transfer to looking for writers in other industries.


Write the Docs 2018

Had an amazing experience at the 2018 WriteTheDocs conference in Portland, OR, last week. This was the first time I’ve been able to attend the conference, although I’ve followed it for years. You can watch the formal talks from this and previous conferences on YouTube, but it’s nothing like attending in person. What the excellent videos can’t convey, and what is almost tangible during the conference, is the sense of camaraderie, support, and acceptance among the participants.

The talks were a collection of first-person testimonials and experience reports by practicing (well, except for me, perhaps) technical writers and managers. They all had a practical focus, with tangible takeaways the audience could put to work on the job.

Bottom line: if you are a technical writer and can go to only one conference a year, I recommend this one quite highly.

The downside to the conference is that there are too many places to be at once. In addition to the formal talks in the ballroom, a parallel “unconference” and a job fair took place in a smaller room below the ballroom. To complicate matters further, there were all the people I knew from the WriteTheDocs Slack to meet in person.


Collecting feedback about your documentation

[Image: KC-135R engine instruments showing various indications during testing. Caption: Lots o’ data]

In my thread on user interactions with documentation, I suggested that you want your measurement instrument, usually some form of survey question, to influence the experience you’re trying to measure as little as possible. From a measurement standpoint, that’s nothing new: you never want the act of measuring to influence the thing being measured, because that contaminates the data.

In the case of documentation or experience feedback, a recent tweet by Nate Silver (@NateSilver538) described how the use and perceived use of the measurement system had contaminated the data collected by Uber/Lyft (and countless other services). He tweeted, “Given that an equilibrium has emerged where any rating lower than 5 stars means your Uber/Lyft driver was bad, they should probably just replace the 5-star scale with a simple thumbs-up/thumbs-down.”
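
To make the point concrete, here’s a minimal sketch, in Python with made-up ratings, of how you could check whether a 5-star scale has collapsed into the effectively binary signal Silver describes (the threshold value is an assumption, not an established cutoff):

```python
from collections import Counter

def is_effectively_binary(ratings, ceiling=5, threshold=0.9):
    """Report whether a star scale has collapsed into a binary signal:
    responses pile up at the ceiling, and anything lower means 'bad'."""
    counts = Counter(ratings)
    total = len(ratings)
    ceiling_share = counts[ceiling] / total
    # A healthy scale makes real use of its middle values (2-4)
    middle_share = sum(counts[r] for r in range(2, ceiling)) / total
    return ceiling_share >= threshold or middle_share < (1 - threshold)

# Hypothetical ratings reflecting the equilibrium Silver describes
ratings = [5] * 92 + [1] * 5 + [4] * 3
print(is_effectively_binary(ratings))  # True: thumbs-up/down would do
```

If a check like that comes back true for your documentation feedback, the extra resolution of the star scale is an illusion, and a simpler instrument would intrude on the experience less.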


My GitHub is a mess and that’s a good thing

You’ll have to take my word for it (or find it on your own; I’m not trying to hide). I’ve seen GitHub accounts treated as a developer’s code portfolio, and I think that’s a shame. I treat mine more like a code sandbox where I can go to play and learn.

Having a portfolio is a good thing for those who want to show off such things; however, having looked at a few open-source projects, I see more on the sandbox end of the spectrum than on the portfolio end, by a 2:1 margin, I’d guess (I didn’t actually count).

One project that I’m in the middle of is definitely not anywhere near the portfolio end but it’s exactly where it should be. I’m using it to test out different ideas that, if all goes well, will inform the production version of the project, at which point, I’ll create a new repo that incorporates the best of the various experiments. Until then, it’s going to be where I do what I need to do to figure out what it is I want to do.

If you want to see how I’ve coded for money, you’ll have to look at examples in which I did exactly that. Unfortunately, I don’t have any copies of such work, except for maybe some old code examples I used in some documentation.

If you want to see what I’m tinkering with, feel free to check my GitHub Repo (after you find it).

From getting started to getting finished

As a technical writer who occasionally does some development on the side, I appreciate a fast Time to First Hello World (TTFHW): the time it takes to create and run a simple program using an API that’s new to me. The time to get started, in other words. The Nest API documentation I recently reviewed reminded me of a much less frequently mentioned notion: the time to get finished.

If you’re not familiar with TTFHW, it is, basically, how long it takes a new user to:

  1. Find your product
  2. Resonate with its value proposition
  3. Test it out to verify it provides them with a viable solution

The metric is operationalized as the time it takes someone to write a simple program that demonstrates step 3. From a measurement standpoint, this is a reasonable goal in that:

  1. It’s tangible and relatively unambiguous in its meaning (the reader succeeded in accomplishing the goal).
  2. It can be measured accurately if your help documentation provides a way to link the documentation to the use of the API, through a user-specific key or an interactive development environment (see the sketch after this list).
  3. It provides a tangible engagement metric that shows your product is attracting and engaging potential customers.
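
As a sketch of what that measurement could look like, linking a user-specific key to logged events lets you compute TTFHW directly from timestamps. The event names and log format here are hypothetical, not from any particular analytics tool:

```python
from datetime import datetime

# Hypothetical event log keyed by a user-specific API key:
# (api_key, event_name, ISO 8601 timestamp)
events = [
    ("key-123", "signed_up", "2018-05-01T09:00:00"),
    ("key-123", "viewed_quickstart", "2018-05-01T09:05:00"),
    ("key-123", "first_successful_call", "2018-05-01T09:27:00"),
]

def ttfhw(events, api_key):
    """Time to First Hello World: elapsed time from signing up to the
    user's first successful API call."""
    times = {name: datetime.fromisoformat(ts)
             for key, name, ts in events if key == api_key}
    return times["first_successful_call"] - times["signed_up"]

print(ttfhw(events, "key-123"))  # 0:27:00
```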

While TTFHW is a good metric for engagement and potential adoption, it’s important not to treat it as the only metric that counts, because it doesn’t necessarily represent retention or the other stages where the money is made or lost.


All is not gold that glitters, especially with API docs

I saw some buzz about the Nest developers’ API explorer and its corresponding developer documentation, so I took a look. New API documentation formats attract a lot of buzz (and envy) in technical writing circles, from technical writers wondering, “How can I make my documentation look like this?” to executives demanding, “Why don’t our docs look like this?!” Rarely do I hear, “Should our docs look like that?” But that’s what I’m here for (it seems).

Should your API docs look like the Nest developers docs?

The answer is, of course, maybe.

I don’t have any inside information about why or how the writers of the Nest documentation did what they did, so the following analysis is based on my observation and inference.

What it does well

It has a very clean UI. A clean presentation conveys professionalism and inspires confidence. The styling is consistent and aligned with the consumer site’s branding, which implies the company considers its API docs part of its brand promise. I think that’s a good sign: externally, it shows the company treats the API as part of the product’s brand and not as a bolt-on or afterthought. As a developer, I find that reassuring.

As first impressions go, it has very nice curb appeal.


If your API is hard to document, be warned

[Image: Coal miner looking at canary in cage]

Coal miners used to bring canaries with them into the mines to detect toxic gasses. Canaries would die when the gasses were present but still at levels not fatal to humans. They were an early-warning sign. The challenge is that you had to keep an eye (or an ear) on them to heed their warning. If you were busily mining your coal (a rather noisy job to begin with), you could easily overlook the warning sign of a dead canary, much to your peril.

Think of your technical writer as a canary (that can also type).

I read a lot about how API documentation is deficient, how it should be automated, and how it should be made easier to produce and update. I’m all for making it easier to write and maintain, but I think replacing the canary with a device or process that won’t let you know when there’s trouble ahead is solving the wrong problem.

Why IS your API hard to document?

It could be that it’s toxic and you haven’t noticed the dead canary.

I read The Top 20 Reasons Startups Fail, a CB Insights study of the reasons failed startups reported for their failure, and those reasons looked surprisingly similar to challenges I’ve seen in my career documenting and researching APIs.

Coincidence? Let’s take a look.

There’s no market need

This is the #1 cause of failure cited by the startups reviewed in the CB Insights study. What does this have to do with an API?

If there’s no market need, what use cases will the documentation address? To whom will the technical writer explain all these features? With what burning customer pain points or unmet needs will the documentation connect? What sort of demos and tutorials will need to be written (or referred to) to demonstrate solutions to customer problems? Why will anyone care?


The right tool to measure your content’s performance

The past few days, I read a couple of articles on content metrics from the blogosphere: one had promise but ultimately indulged in some analytic sleight of hand, while the other actually made me smile, and its focus on a solid methodology gave me hope.

Why is a solid methodology important? It’s the basis of your reputation and credibility. It’s the difference between knowing and guessing. These two articles illustrate that difference.

First, the one with a sound approach but some flawed measurement methods.

How metrics help us measure Help Center effectiveness

This article has promise, but trips and falls before the finish line. To its credit, it recommends:

  1. Setting goals and asking questions
  2. Collecting data
  3. Reviewing the data
  4. Reviewing goals and going back to #2
  5. Lather, rinse, repeat (as the shampoo suggests)

As a general outline, this is as good as they come, but the devil is in the details. If you’re in a hurry, just skip to the end or you can…


More API documentation resources

I added a couple of new resources to help the technical communication community. Here is the current list of lists.

  1. Some of my favorite user experience books can be found at Selected readings on user experience design.
  2. My list of API documentation articles (non-academic) has grown from 18 to 30 articles. Most new ones are from the past year, but I found a few earlier articles that I seem to have missed in my last update.
  3. My original list of academic (for the most part) papers contains the articles that I’ve cited in various papers.
  4. A recent post lists some new academic work about API documentation that I haven’t yet filed away in any of the preceding lists.

Plotting them over time shows some recent growth in frequency, similar to the academic articles on API documentation; however, there are 66% more academic articles on the topic in my [ever expanding] collection, so it might still be too soon to draw any comparisons.

Different discourse communities or different worlds?

I’m still intrigued by my discovery from yesterday that I describe in API documentation research-Where’s Tech Comm? The idea seems like it might have some legs for some tenure-worthy academic paper(s), so I thought I’d dig into it a bit more to see what I might be signing myself up for.

And the plot thickened, which, for an academic, is a good thing: it means potential for papers, talks, and maybe even a book (yippee!). But I’m getting ahead of myself. (No book tours yet.)

Disclaimer: what follows is a peek into my notebook and represents a work-in-progress. Any claims I make (or appear to make) are subject to change as the research progresses.

I started to take the study a little more seriously and methodically; that is, I started collecting data. For now, my working research questions have evolved from, “Why are only computer science researchers studying API documentation, and why don’t they refer to technical communication research much (or at all)?” to “Have computer science researchers ever heard of technical communication research?!” and vice versa.

Google Scholar is for Computer Science research

In searching for “API Documentation” on Google Scholar, I collected something like 50 articles and papers having to do with API documentation over the years (the earliest paper I’ve found so far is from 1996). The following figure shows the distribution of papers by year published.

[Figure: Academic papers about API documentation, a histogram showing the number published each year]

It’s good to see the topic being researched (finally!). I started studying them around 2008-2009 when, as you can see in the chart, there weren’t many to use as a reference. As academic fields of study go, this is definitely a niche topic. And, yes, the bars add up to more than 50-something because I’m still cleaning my data set. The numbers will be rough in the meantime, but they seem representative, even if they aren’t precise yet.
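
For what it’s worth, the chart itself is easy to reproduce once the data is collected. Here’s a minimal sketch, assuming the publication years have been pulled into a plain list (these years are illustrative, not my actual data set):

```python
import matplotlib.pyplot as plt

# Illustrative publication years; one entry per collected paper
years = [1996, 2003, 2009, 2009, 2012, 2014, 2015, 2016,
         2016, 2017, 2017, 2017, 2018, 2018]

# One bin per year, from the earliest paper to the latest
bins = range(min(years), max(years) + 2)
plt.hist(years, bins=bins, edgecolor="black")
plt.xlabel("Year published")
plt.ylabel("Number of papers")
plt.title("Academic papers about API documentation")
plt.show()
```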

The more I looked at the academic papers I was collecting, the more skewed the trend became toward computer science(y) journals and conferences, as this chart shows.

[Figure: Distribution of API documentation research papers by type of publication, a pie chart showing 8 articles from TC publications and 49 from CS publications]

From the look of that graph, you might think that TC is hardly talking about API documentation. It’s even more stark when you take into account that I wrote 5 of the 8 articles in the TC slice of the pie.

Yet I hear lots of API documentation talk on TC social media channels, and LinkedIn has several groups dedicated to the topic, so something isn’t passing the sniff test.

Is TC research on API documentation only published in blogs?

As a reality check, I went to regular Google, searched on API documentation, and found a whole bunch of relevant articles. I haven’t formally collected them into my list, so I can’t say how many yet, but the topic is clearly being talked about on the web. From a quick skim, the articles from the general web seemed more TC oriented than CS oriented (but don’t quote me on that, yet).

What slowed my progress, momentarily, was that I wanted to study the scholarly articles they were using as references. I think that’s where some tenure-worthy research opportunities will be found. I’m (slowly) categorizing the references cited by each of the scholarly articles to get an idea of what they’re basing their research on. That’s going to be incredibly tedious: 50+ articles, each with 30+ references, means 1,500+ citations to clean up and organize (if only they just attached a metadata file).
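
The first pass at the categorizing will probably be a crude keyword match on venue names, something like the sketch below; the keyword lists and venue strings are illustrative, not my real data, and the real work is in building and checking those lists by hand:

```python
# Crude keyword matching to sort each cited venue into a bucket.
CS_KEYWORDS = ("software engineering", "icse", "programming", "computing")
TC_KEYWORDS = ("technical communication", "professional communication",
               "sigdoc", "writing")

def categorize(venue):
    """Bucket a cited venue as TC, CS, or unclassified (manual review)."""
    v = venue.lower()
    if any(k in v for k in TC_KEYWORDS):
        return "TC"
    if any(k in v for k in CS_KEYWORDS):
        return "CS"
    return "unclassified"

# Illustrative venue names, not entries from my data set
for venue in ["Proceedings of ICSE",
              "Technical Communication Quarterly",
              "Empirical Software Engineering"]:
    print(venue, "->", categorize(venue))
```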

Unfortunately, the blog posts and informal references from the TC community in “regular” Google don’t tend to list as many citations as the academic articles, so looking for their foundations is going to take some more detective work. But, one thing at a time.

What it’s looking like, however, is that the academic CS scholars don’t refer to much in the way of TC research. I suppose that because the TC posts don’t provide many citations, you could say the TC side doesn’t refer to CS research either. Maybe I’ll need to turn to some tcmyths for more info?

In any case, regardless of where this goes, I’ll end up with a killer bibliography of API documentation!