737 Max: A UX Perspective
Former Boeing usability engineer, Roger Belveal, offers a UX design perspective on the 737 Max MCAS. Belveal was employed by Boeing during the 777 program era and was a co-founder of the first Boeing usability testing lab in Seattle in the mid-1990s. Considering the user experience of the pilots, he explains how the MCAS (Maneuvering Characteristics Augmentation System) fails all ten of the well-known industry principles of usability. This is Part One of a three-part blog on the 737 Max MCAS.
“I couldn’t trust my airplane anymore.”
These words of Captain Stefan G. Rasmussen say it so clearly. These might have been the thoughts of the pilots of either of the crashed Boeing 737 Max airplanes. Tragically, neither is available for comment. Captain Rasmussen was the pilot of Scandinavian Airlines Flight 751, another airplane whose onboard software similarly stole control from the pilot, resulting in a crash, but fortunately one with survivors.
System Takes Control of the Airplane
Shortly after takeoff of Flight 751, Captain Rasmussen discovered that ice from the wings had broken away and entered both engines, which are at the rear of the MD-81 airplane. To give the ice a chance to clear and avoid engine damage, this seasoned pilot throttled down both engines momentarily. Unfortunately, this triggered an onboard computer system whose existence was unknown to the pilots. The system detected the slowing of the engines, overrode the captain, and throttled up, immediately destroying both engines.
Miracle on the Frozen Lake
At low altitude with no power, Captain Rasmussen was forced to crash-land the plane on a nearby frozen lake, much like Captain Sullenberger did in the ‘Miracle on the Hudson’. Captain Rasmussen’s heroic landing was never portrayed by Tom Hanks, so most people have never heard of him, though the full story of Flight 751 is well documented.
“teamwork between man and machine”
Captain Rasmussen was so traumatized that he was unable to resume his flying career. The reason was not the crash on the frozen lake, but the violation of the “teamwork between man and machine” and that, in his words, “I couldn’t trust my airplane anymore.”
“The startle factor is real”
“The startle factor is real”, said Captain ‘Sully’ Sullenberger, commenting on the 737 Max MCAS. https://globalnews.ca/news/5409480/pilots-criticize-boeing/. See YouTube video https://globalnews.ca/video/rd/1547747907950/?jwsource=cl
Common Sights in Usability
I have been a usability analyst a very long time, about three decades altogether, including ten years with Boeing Commercial Airplanes. In observing hundreds of usability studies, such frustration is a common sight. When a system suddenly ignores input, or worse, does the exact opposite for no apparent reason, human beings have a need to understand why. As they begin questioning everything, confidence and performance take a beating. In our digital lifestyle, we all experience these frustrations daily. In most of these circumstances, the impact is benign. For a pilot with a hundred lives on board, such a disconnect could be excruciating.
Defective by Design?
We would assume such disconnects between system behavior and expectation are due to a defect. However, it is also common that these are intentional features based on designers’ assumptions that turn out to differ from the actual circumstances of a moment of real life.
Risk Management
Much of the field of UX (User eXperience, AKA usability engineering) is about designers and engineers better educating themselves on the actual circumstances of use. At the same time, there is a set of standard usability design principles that remind designers that predicting the future needs of users is difficult to impossible to do with 100% accuracy. Despite the miracles of intelligent system capabilities, keeping users in charge remains the safest approach.
The Ten Principles of Usability
There are ten well-known principles for software design in the industry. These principles, or ‘heuristics’, were developed by usability guru Jakob Nielsen, based on analysis of many hundreds of usability studies. Because these are principles of interaction and not of any one technology, they have stood the test of time. Multiple generations of technologies later, they remain among the best tools for analyzing and explaining the usability of a given design. They are listed below. The full descriptions of each can be found on the Nielsen web site, https://www.nngroup.com/articles/ten-usability-heuristics/
10 Usability Heuristics for User Interface Design
Visibility of system status
Match between system and the real world
User control and freedom
Consistency and standards
Error prevention
Recognition rather than recall
Flexibility and efficiency of use
Aesthetic and minimalist design
Help users recognize, diagnose, and recover from errors
Help and documentation
Heuristics are an Enduring Tool for Analysis
When we built the usability lab at Boeing in the mid-90s and established a set of usability design practices, these principles were one of our key toolsets. We used them routinely to help clients and stakeholders understand the performance of their systems and recommended improvements. In my career since, I have applied them with success to systems of all types in many business domains, as well as to physical environments and processes. They have recently received a lot of attention because of their timelessness and applicability to any technology. Even as we move away from traditional screen-based UI into realms such as AI, VR, AR, voice-only interaction, etc., these heuristics remain pertinent.
Applying Heuristics to Gain Insights
I invite the reader to consider these principles in terms of your own experiences, and then to consider the pilots’ experiences described earlier. Practicing empathy as a deliberate technique, and not just a feeling, with these principles in mind is core to the practice of design thinking. The purpose is not just to determine whether there was a failure. That is already clear. We want to gain a deeper understanding of the dynamics at work, with insights beyond the surface circumstances of the incidents. Design principles can aid in that understanding. Whether and how such insights are applied is ultimately a matter of design culture.
Reviewer’s Disclaimer
This quick review is based on my limited knowledge gained from news articles. I do not have any special access to the system or to its specs or to its designers as I would typically have when conducting a professional product review. No one has hired me or requested I do this. It is my own interest in this subject and its grave implications to the industry that is my motivation. I have no other interests.
My Heuristic Review of the Boeing 737 Max MCAS User Experience
Visibility of System Status:
[ ] PASS [X] FAIL
The only visibility provided to the pilots regarding this system’s status, operation, or even its existence, is that it suddenly took control of the airplane and crashed it.
Match between System and the Real World:
[ ] PASS [X] FAIL
A system aligning with the real world empowers users to apply existing knowledge toward understanding and operating it successfully. In this case, the MCAS presented nothing. It functioned mysteriously apart from every other system of the plane.
User Control and Freedom:
[ ] PASS [X] FAIL
Keeping the user in control is among the most sacred of all of these principles and often the hardest to safeguard. Without it, there is little purpose to the other principles. Unfortunately, it is common for designers, engineers, and others to be so convinced about what system behaviors should be that they diminish user control. In this case, the user control was diminished all the way to zero.
Consistency and Standards:
[ ] PASS [X] FAIL
As the user interface was essentially nonexistent, there is no clear evidence that it conformed to any relevant standards for flight deck design and operation.
Error Prevention:
[ ] PASS [X] FAIL
Non-existent design offered nothing in this regard.
Recognition Rather than Recall:
[ ] PASS [X] FAIL
This principle simply means that human minds are much better at responding to information presented than at remembering it from the past. In the case of the MCAS, the information provided to users was nonexistent, past or present, leaving nothing to be recognized or recalled.
Flexibility and Efficiency of Use:
[ ] PASS [X] FAIL
Providing options suited to the user’s level of skill and need can offer many benefits. The user interface patterns suitable for occasional novice users may diverge substantially from those used daily by professionals.
Airline pilots, in my experience, are the best example of an expert user, and the style of user interface suitable for them is one extremely rich in information, with multiple options for their discretion, including when and how to employ automation with full override capabilities. The MCAS provided no information and not even one option.
Aesthetic and Minimalist Design:
[ ] PASS [X] FAIL
MCAS is a very clean and simple design, certainly uncluttered by any information or controls. Minimalism, however, is not defined as nothing. Rather, minimalism is the least that does the most and does it well. A non-existent UI design is not minimalism. The actual experience it produced, that of fighting for control of the airplane and crashing, was anything but minimalist.
Helping Users Prevent, Recognize, Diagnose, and Recover from Errors:
[ ] PASS [X] FAIL
The types of features that can help avoid errors, or help deal with them if they occur, are many. Unfortunately, none of those practices were put to use with the 737 Max MCAS. The user was left completely in the dark as to what was happening and why.
Help and Documentation:
[ ] PASS [X] FAIL
There was no documentation on board the plane or elsewhere for the pilot’s awareness of the MCAS operation or even of its existence.
It is a cruel irony that pilot error is typically the first suspect in an incident, and increased training the go-to remedy. The prospect of training pilots on a system that is invisible, autonomous, and cannot be overridden is dubious to absurd.
Conclusion: FAIL 10 of 10
The 737 Max MCAS fails all ten of the industry-standard heuristics for system user experience.
Applicability of UX Heuristics?
Some may argue that principles of usability do not apply to automated systems. To that assertion, I have two answers. The longer answer I will explain in Part 2 of this blog, where I discuss culture, and Part 3, where I will address artificial intelligence and autonomous systems. The shorter answer is that two airplanes have crashed over problems that might have been easily solved if pilots had been empowered with information and a control switch. Many more airplanes are grounded. Public confidence in the Boeing brand has suffered harm and losses in the billions. In other words, how’s that philosophy working for you so far?
Part 2: Culture
In Part 2, ‘A Series of Unfortunate Design Decisions: A UX Perspective of 737 Max’, Belveal will discuss the impact of engineering and IT cultures, the subtle but significant differences between airplane design and information technology and their potential effects on design decisions and outcomes.
References
There is a splendid documentary on Flight 751 at https://youtu.be/a6oJUt4WWdQ
I have also posted this video as a permanent part of my web site at https://www.belveal.com/art-experience-design
See the YouTube video “Pilot Betrayed”, about the machine violating that trust, featuring Stefan G. Rasmussen, pilot of Scandinavian Airlines Flight 751.