Steve Krug has written two well-received books about usability, Don’t Make Me Think: A Common Sense Approach to Web Usability and
Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems. This workshop was based entirely on the second book and dealt with making usability testing as simple as possible and weaving it into your monthly design cycle. In fact, the book is really the script for the workshop/course that Steve runs. The material was pretty much perfect for me, as I’ve been kvetching about not being able to do exactly this as a routine part of our cycles at BCIT.
As my work straddles UX design and project leadership, I spend my time in bits and pieces of both without doing either as well or as thoroughly as I’d like. My experience leading usability tests has typically been to bring in several users at once and conduct the sessions with at least one observer in the same room. I haven’t typically used screen-capture software; instead I’ve relied on a combination of observer notes and audio recording, strongly encouraging participants to think aloud as they complete tasks from the test script.
What the workshop really did for me was three-fold:
- It showed the real advantage of capturing both audio and desktop video, providing a much richer data set for the research;
- it made me realize that moving from basic task completion to more fully fleshed-out scenarios allows you to connect the test more closely with your own goals and personas (assuming you have gone beyond audience segmentation and actually have them); and
- it helped me understand why doing this as regularly as possible is so helpful.
I found the rapid sample usability testing on a real website extremely valuable, and the key piece of the workshop. As I often find when doing any kind of research, my biggest challenge is keeping my mouth shut and avoiding introducing bias into the test. I had the same issue when conducting my MA thesis interviews in 2009. A nice little addition is a set of handy downloads for running usability tests, available here.
If I had two suggestions for this workshop, they would be to include Steve’s book in the fee (particularly since Lou’s workshop the next day did) and for Steve to stray a little further into other aspects of usability testing and run a couple of more varied exercises. Since I’ve done some of this work on a few occasions now, I found the workshop parroted the book’s content a little too closely. However, I also recognize that designing a workshop for diverse experience levels is a real challenge, and the book is a very useful guide to weaving user testing into your normal cycles.
My takeaway will be to try to figure out a way to introduce more of this at work. Particularly in an environment where overall design changes happen infrequently, and where you begin to take the relationship between your site and your users for granted, it can be difficult to step back and spend time on operational design work. I know it won’t happen a day per month as Steve suggests, but some usability testing of the lowest-hanging fruit on our main public web property, perhaps on a quarterly basis, would be very useful (giving us enough time in between to actually fix some of the findings). And, like pretty much every large website with multiple audience segments, we have lots of low-hanging fruit. I’ve talked often about just that: routine quarterly testing for iterative improvements. Since I was also lucky enough to snag Remote Research as a draw prize at the day-three workshop, I expect more of my user testing to be done remotely. My hope is that the convenience of remote testing might give me a fighting chance to actually set up a quarterly user testing process.