During the “break” between semesters, I try to catch up on the tasks I deferred during the past semester and get ahead of the tasks I know will come up during the coming one. In the process, I’ve had several encounters with documentation, which I’ve sorted into these categories:
- GOOD: documentation doing what it should, making me (the customer) successful.
- COULD-BE-BETTER: documentation that is well-intentioned, but needs a tweak or two.
- UGLY: documentation that gives me nightmares.
Here goes.
Good documentation
These are the stories of documentation that made me successful. Being successful makes me happy. Good documentation should help make the reader successful.
Ford Motor Company: F-150 owner’s manual
The windshield wipers on my almost-three-year-old truck are the ones it came with from the factory, almost three years ago. Well past their expiration (and effectiveness) date. But, while I’ve changed car engines and gearboxes before, I hate changing wipers. They always have some clever (and obscure) trick to getting the old ones off and the new ones on. I especially hate it when they go flying off the car in a rainstorm. So, for all these reasons, I’ve procrastinated on changing them for far too long (as driving in recent wet weather has reminded me); the replacements have been in my garage since…I honestly can’t remember.
Yesterday was sunny, making it a good day to work on the cars. To overcome my fear of windshield wipers, I dug out the [still pristine] owner’s manual for my truck. Applying the available search technologies (the index in the back of the manual), I found the entry for windshield-wiper maintenance, turned to that page, and was greeted with a pictorial procedure for removing and replacing the wipers. Faster than I could have opened the browser on my smartphone! Armed with this knowledge, I replaced the wipers in just minutes.
Success!
Postman
Another task that I’ve been deferring for too long is refactoring my piClinic Console project’s code. The prototype code that’s gotten it this far is running out of runway, and next summer it will be heading to real clinics to manage real data. I need to whip the system into shape and get it ready for prime time. As part of that process, I want to add some automated tests. My son, who does this for a living, recommended Postman. So, I downloaded the app and signed up.
After a little confusion about whether to install an app or use a Chrome extension (use the app), I got it going. For pinging REST endpoints, it was easy enough, but I wanted to start building tests that would give me a “thumbs-up/down” indication of the API as I refactor it. In [what I would imagine to be] true developer style, I started banging on it (sans documentation, of course) to get as far as I could. The UI guided me through it, and I ended up with just two questions:
- How do I get the data from a response body to use in another test?
- How do I get a pass/fail indication?
Looking these up (i.e., searching Google and reading a combination of Postman documentation and Stack Overflow posts, again in true developer fashion) ended the evening with two [simple] automated tests that I could save in the project’s GitHub repo.
0 to 2 automated tests in as many hours felt like…
Success!
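For the curious, the answers to both questions turned out to live in Postman’s test scripts: JavaScript that runs in a request’s Tests tab. Here’s a minimal sketch of what such a script might look like; the response shape and the "token" field are hypothetical placeholders, not the actual piClinic API:

```javascript
// Runs in the request's "Tests" tab after the response arrives.

// Pass/fail indication: each pm.test() shows up as a green pass
// or red fail in Postman's test results.
pm.test("Status code is 200", function () {
    pm.response.to.have.status(200);
});

// Getting data out of a response body: parse the JSON...
var body = pm.response.json();

pm.test("Response includes a session token", function () {
    pm.expect(body.token).to.be.a("string"); // "token" is a made-up field
});

// ...and stash a value in an environment variable so that
// later requests (and their tests) can use it.
pm.environment.set("sessionToken", body.token);
```

A later request in the collection can then reference that saved value as {{sessionToken}} in its URL, headers, or body.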
Documentation that could be better
This one is documentation-related and well-intentioned, but needs a tweak or two.
What Technical Communication Managers Must Do to Prove the Value of Their Deliverables
Hannah Kirk’s article, What Technical Communication Managers Must Do to Prove the Value of Their Deliverables, posted to Writing Assistance, Inc.’s website (and appearing in my Twitter feed this morning), is a well-intentioned case study on the value of technical communication, but it falls short in the same way as so many similar case studies published since I started following the topic (about 15 years now).
The article describes how tech writers at what the author describes as “a software company I worked with recently” identified and addressed several problem areas. Tech comm definitely needs to highlight these experiences, and measuring the value of technical communication is one of my favorite topics, so I approach these articles with excitement that, invariably, deflates into disappointment when I see them apply the same, seemingly ineffective, approach.
The approach I’ve seen over and over again, and which this article takes, is to identify pain points and then demonstrate how tech comm can (and seems to) address them. What could be wrong with that? Well, for most of tech comm, nothing; except we aren’t the audience that needs this information. I think it’s safe to generalize that tech comm academics and professionals like to hear when tech comm “wins,” but we don’t really need to hear that message. It’s everyone outside of tech comm that needs to hear it, and, frankly, they haven’t.
What makes this approach ineffective (in that the audience who needs to hear the message still hasn’t) is that it doesn’t include hard numbers.
Show me the numbers!
This article lists some seemingly valid pain points (duplication of effort and content, inconsistent content seen by customers, and a company image that appears disorganized, to name a few) and goes on to describe how applying technical communication practices reduced those problems. All good info.
What the article doesn’t show, and what our external audiences need to hear, is how much money this saved and/or made for the company. Without numbers, this is just a nice story. Numbers (especially valid and demonstrable numbers) are what gets attention.
What would improve the impact of these articles, even though they are anecdotal case studies, is numbers. Time and money: how much was the old process costing and how much did the new process save?
For example, what did the duplication of content cost? How many customers felt the company appeared disorganized? How long did it take for customer service reps to find the right info? What are the most expensive pain points?
You can’t credibly mention the “bottom line” in an argument without including some bottom-line numbers (or at least percentages, if the actual numbers are confidential). To pick a purely hypothetical example: if 20 support reps each spend 15 minutes a day hunting for the right content, that’s 1,250 hours of paid time a year (at 250 working days); multiply that by a loaded hourly rate and you have a bottom-line number.
Could be better.
Ugly
This experience started in the documentation but eventually left me with nightmares of my computer being taken over. I share it with you as an example of what not to do, under ANY circumstances.
HP Printers
I’ve been a happy customer of HP computers and printers for going on 20 years. I’ve had no ax to grind with them, and I recently bought a rather expensive printer, in part due to my past experiences with their products. The printer is a multi-function peripheral (MFP), so it acts like three devices in one (printer, scanner, and fax), and I’m still getting it configured (and figured out). While trying to send a fax (yes, apparently some people still use them!?), I got an error that didn’t make sense.
I went online, and HP’s [very stylish] help site suggested I download the Print and Scan Doctor.
OK, fine. However, before downloading, I got a dialog saying something to the effect of, “Before you can download this tool, you must consent to answering a survey.”
Say WHAT??!!
First, I clicked Cancel. I would have sent them Gerry McGovern’s article, No, I don’t want to take your survey, if that had been an option.
But, over a barrel and under a deadline (a bad place to be), I was desperate and clicked OK. I figured I could just ignore it. I was about to learn how seriously they took this commitment.
The software downloaded and installed and, ultimately, wasn’t helpful. I still had a fax task to complete (in addition to the other tasks I was working on at the time), so I gave up on the tool to get back to it.
Not so fast, buck-o. You consented to a survey, so you’re taking a damn survey! And, you’re taking it NOW!
Closing the diagnostic tool opened a non-closable, non-cancelable dialog box with the start of a consumer information survey. Not only could I not find anything to click to close it, I couldn’t find its process in the Task Manager to kill it. Ultimately, I ended up stopping the two other projects I was in the middle of (and the fax task I was still trying to figure out) to restart my system, just to make that survey dialog go away.
I hope that I don’t need to tell anyone, anyone besides this survey’s designers, that for a successful customer support experience:
- You appreciate the customer’s situation, work with it, and don’t take advantage of it to extort survey participation. Extorting survey participation is a bad practice.
- Even if you ignore the first point, you always let the participant leave. Holding survey participants hostage is a bad practice.
If that data is so important, offer to compensate the participant for their time, and always let them leave when they want to leave. The feeling of powerlessness and intrusion I felt in this experience, when all I wanted was to find a solution, was profound.
I understand the desperation to collect feedback and the pressure to produce some numbers, but I won’t resort to kidnapping or extortion to accomplish that. Hopefully, neither will you.