Learning Outcomes
Understand:
- Errors are opportunities for figuring out what needs to be changed and improved
- Categorising errors exhaustively is almost impossible, because there are too many things that can go wrong
Remember:
- The person-centred versus system-centred view of error
- Reason’s model of describing errors (see Ritter/Baxter/Churchill, Chapter 10)
- The Swiss Cheese model of how errors happen despite safeguards
Apply:
- When an error is observed, reverse-engineer the situation to determine what caused it
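The Swiss Cheese model pictures each safeguard as a slice of cheese with holes: an error only reaches the user when the holes in every slice line up. As a rough intuition pump (not part of Reason's original formulation), here is a minimal simulation with hypothetical failure probabilities, treating each safeguard's failures as independent:

```python
import random

random.seed(1)  # reproducible runs

def error_reaches_user(layers, trials=10_000):
    """Estimate how often an error slips past every safeguard.

    `layers` holds one hypothetical probability per safeguard: the
    chance that its 'hole' lines up, i.e. that it fails to catch
    the error. A breach needs every layer to fail at once.
    """
    breaches = 0
    for _ in range(trials):
        if all(random.random() < p for p in layers):
            breaches += 1
    return breaches / trials

# Three imperfect safeguards, each failing 10% of the time on its
# own, jointly let through only about 1 in 1000 errors.
print(error_reaches_user([0.1, 0.1, 0.1]))
```

The point of the sketch is that no single layer needs to be perfect; adding (or strengthening) any one layer multiplicatively reduces the breach rate, which is why redesigned workflows and tools can prevent errors even when individual humans remain fallible.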
Preparing for the Lecture
This week there will be relatively little new theory. Instead, we will use the tools we discussed in the first weeks to analyse user errors. Especially in high-stakes environments such as aviation and medicine, most errors are avoided by redesigning workflows and tools so that the errors cannot happen in the first place. It’s no coincidence that helping users recover from errors is a separate guideline in the Nielsen/Molich heuristics we discussed last week.
If there is one thing I want you to take away from this course, it is that user errors are invaluable data for system designers and developers. Far too often, developers and administrators shrug off user error as just one of those things, or wish that users would just read the (insert swearword here) manual. And there are plenty of occasions when these feelings are justified. (I admit that I sometimes feel the same about students who don’t read the course handbook.) However, more often than not, making errors the user’s sole responsibility will stop you from improving the interface, the system, and the workflow in ways that would prevent these errors from happening in the first place.
Making interfaces less complex isn’t the sole answer. Some interfaces, such as airplane cockpits, need to be complex. But there are many safeguards in place that support users in dealing with this complexity. Here’s an example from an airline pilot, who describes undergoing recertification.
What is also crucial is an atmosphere where constructive criticism is encouraged. Without feedback on what is going wrong, it’s impossible to improve. This is why I am so glad many of you completed the mid-term feedback survey, and why I am always happy to have students point out issues with materials, diagrams, or handwriting in class.
Core Readings
- Chapter 10, Ritter/Baxter/Churchill (see Edinburgh University Library)
- Reason (2000). This is a classic paper on how to deal with human error. In fact, if there is only one reading I want you to remember from this course, it’s this one.
If you want to explore further what it takes to make devices safer, have a look at the CHI+MED project, which focused on medical device safety, in particular the brief overview booklets.
If aviation is more your thing, I can recommend the Cockpit Conversation blog, which is full of observations that show you how revealing problems and errors can be.