On March 10, 2019, Ethiopian Airlines Flight 302 took off from Bole International Airport in Addis Ababa, Ethiopia. After only six minutes of flight, it crashed, killing all 157 passengers and crew on board.
On October 29, 2018, under similar circumstances, Lion Air Flight 610 left Jakarta International Airport. It crashed 12 minutes after takeoff, killing all 189 passengers and crew.
What do these two incidents have in common? Both involved Boeing 737 Max 8 aircraft.
It’s too early to pinpoint exactly what happened on board these two flights or what caused the pilots to lose control of their aircraft only a few minutes after takeoff. Formal investigations are still under way. What we do know, however, is that the two planes were equipped with the same recently developed anti-stall system: the Maneuvering Characteristics Augmentation System (MCAS).
The newly built 737 Max 8, which first flew in early 2016, came with the promise of better fuel efficiency and significant improvements over previous 737s. Its larger engines were mounted higher and farther forward on the wing, and the nacelles (the pods around the engines) were redesigned.
What Boeing engineers soon noticed, though, was that the new design caused the plane’s nose to pitch up. “To compensate for the difference in handling” and ensure the plane would not stall “in the event the jet’s angle of attack drifted too high when flying manually,” Boeing introduced a new anti-stall system: MCAS.
Human-machine interactions
Human factors is the study of how humans use machines, and how machines and systems should be designed to make them usable and safe for humans. One cornerstone principle is that the human (and not the machine) should be put at the centre of the design process. This is called human-centred design. If you ignore the human in human-machine interaction, problems will soon arise.
As an assistant professor of cognitive ergonomics at the University of Windsor, my work examines how users interact with human-machine interfaces in everyday settings. My expertise covers the assessment and human factors design of human-machine interaction in transportation and manufacturing.
Understanding what a machine does and how it works is necessary for its correct use. Mental models are the internal representations, the pictures in our heads, of how a system works. Accurate mental models are critical to help users operate systems for their intended purpose. Incomplete or inaccurate mental models, by contrast, cause improper system use and often lead to accidents.
When interviewed by the Seattle Times, Jon Weaks, president of the Southwest Airlines Pilots Association, said that information on the new MCAS system and how it functions “wasn’t disclosed to anyone or put in the manuals.” If airlines weren’t informed about how to use MCAS, or even about its existence, then neither was the crew of Lion Air Flight 610. The pilots were left with an incomplete and flawed mental model of their own aircraft.
Adequate training
Training is another component of safe human-machine interaction. Can we drive a car without training? No, and if we do, there will be problems. Can we fly a plane with no or insufficient training? No, and if we do, there will be problems.
It is important not only to understand how a system works when it functions properly, but also to know how to disable or override it when it malfunctions. Adequate training is even more critical for complex systems, or systems featuring a large number of interacting components, like cars and airplanes.
Emerging evidence shows that the training offered to 737 Max 8 pilots prior to the Lion Air crash in October “amounted to an iPad lesson for an hour.” It is still unclear whether and how pilot training changed in the aftermath of the first 737 Max 8 crash. Two reports, one from the New York Times and one from Reuters, suggest that a pilot of the recent Ethiopian Airlines Flight 302 had not practised on the new 737 Max 8 flight simulator to anticipate possible differences from earlier 737 models.
Tug-of-war between plane and pilot
So how would training have made any difference in preventing the crashes? Les Westbrooks, an associate professor at Embry-Riddle Aeronautical University, describes flying with a malfunctioning MCAS engaged as a tug-of-war between the machine and the pilot: the pilots were attempting to pull the plane’s nose up while the malfunctioning system pushed it down.
To use an analogy, imagine trying to steer your car to the left while the car itself forces the steering wheel to the right. To make matters worse, the procedure required to override the anti-stall system appears to have been over-complicated. Even pilots who knew how to manually override MCAS would have had to execute a complicated series of commands, all within an interval of only a few minutes. Boeing has committed to releasing updates to its software and its training protocol.
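To see why repeated automatic activation is so hard to out-muscle, consider a deliberately simplified sketch, in Python, of this tug-of-war. Everything in it is a hypothetical illustration: the sensor threshold, the trim increments and the timing are invented for this example and are not Boeing’s actual MCAS control law.

```python
# Illustrative toy model of the pilot-vs-MCAS "tug-of-war".
# NOT Boeing's actual control law: the threshold, trim steps and
# timing below are invented purely for illustration.

def simulate(seconds: int, faulty_aoa_reading: float = 25.0) -> float:
    """Return the net pitch trim after `seconds` of conflict.

    Positive trim = nose up, negative trim = nose down.
    """
    AOA_STALL_THRESHOLD = 15.0   # hypothetical angle-of-attack limit (degrees)
    MCAS_NOSE_DOWN_STEP = -0.5   # hypothetical automatic nose-down trim per cycle
    PILOT_NOSE_UP_STEP = 0.3     # hypothetical manual nose-up counter-trim per cycle

    trim = 0.0
    for _ in range(seconds):
        # A faulty sensor keeps reporting an excessive angle of attack,
        # so the anti-stall logic re-engages on every cycle ...
        if faulty_aoa_reading > AOA_STALL_THRESHOLD:
            trim += MCAS_NOSE_DOWN_STEP
        # ... while the pilot pulls back to counter it.
        trim += PILOT_NOSE_UP_STEP
    return trim

# After five minutes of this conflict the net trim is firmly nose down:
print(simulate(seconds=300))  # -60.0 in this toy model
```

The point of the sketch is that when the automatic input re-engages on every cycle, counter-trimming by hand never wins; only disabling the system breaks the loop. That is why a clear, quick override procedure matters so much.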
What can we do to prevent this from happening again? The human factors design community has struggled with this question since its inception 70 years ago.
We have made enormous progress toward making machines more efficient and safer. However, if cognitive science has taught us one thing, it is that our processing capacity as humans is limited, and that limitation can push us to take shortcuts whenever the opportunity arises.
The only way forward is to keep our eyes and minds fixed on the importance of human factors, be mindful of our limitations and use a human-centred approach to machine design.
Author: Francesco Biondi, Assistant Professor, University of Windsor
Credit link: https://theconversation.com/human-centred-design-can-help-reduce-accidents-like-the-recent-ethiopian-airlines-boeing-737-crash-113987