Can we still trust flight instruments in the cyber age?
Author: Nissim Belzer, Chief Technology Officer @ CyViation
Trusting the instruments is a cornerstone of every pilot's training. But what happens when flight instruments become unreliable?
Putting your trust in the cockpit’s instruments is one of the challenges of learning to fly. Novice pilots often struggle to reconcile what they see and feel with what their instruments tell them.
The way we perceive the world through our eyes, our inner ear, and our kinaesthetic senses is usually trustworthy, serving us well throughout our lives. But it cannot be trusted when a pilot becomes disoriented in the clouds.
A pilot’s senses can suggest that an aircraft is flying straight and level when it is in fact banking to one side or even in a steep descent. Fortunately, the instruments show the true attitude, and visual and audible alarms alert the pilot in such cases. This system has proven so reliable that pilots are trained to disregard their senses in an emergency and to trust their instruments instead.
On October 29, 2018, and March 10, 2019, two 737 MAX aircraft—Lion Air Flight 610 and Ethiopian Airlines Flight 302, respectively—crashed, costing the lives of 346 passengers and crew.
The entire 737 MAX global fleet was grounded pending accident investigations by the Indonesian and Ethiopian authorities. The final report on the Lion Air crash, released on October 23, 2019, found that a combination of MCAS (Manoeuvring Characteristics Augmentation System) design flaws and AoA (angle-of-attack) sensor failure had caused the flight-control system to repeatedly drive the horizontal stabilizer toward the full nose-down position [ADD citation]. The 737 MAX began returning to service in late 2020 and 2021, with the FAA requiring that all MAX pilots complete MCAS training in flight simulators.
The possibility remains that instrument failure could lead to further tragedies, this time caused not by a faulty sensor but by a cyberattack.
Several research institutes have addressed the human factor during cyberattacks by measuring pilots' reaction times, behavior, and corrective actions. They conclude that higher workload and weakened trust in the system rapidly erode the basic tenet of pilot training: trust your instruments.
To quote one study, “… cyber-attacks influence pilots’ workload, trust in the system, visual information acquisition, behavior, and performance,” going on to conclude, “… warning about an impending cyberattack can moderate several of those effects ….”
But changing the ‘trust the instruments’ paradigm is a tough nut to crack. To date, no major cyberattack on aircraft avionics has been publicly disclosed. Pilots are not trained for such attacks and may not even realize their aircraft is under attack, because most aircraft lack onboard detection systems. Most pilots are not even familiar with the cyber QRH (Quick Reference Handbook) and therefore have no predefined actions to take in the event of an attack.
OEMs (Original Equipment Manufacturers) and airlines are increasingly allocating resources to prepare for future cyber threats. Guidance such as DO-355/ED-204 addresses this imperative, urging operators to proactively monitor their aircraft and train flight crews for the inevitable attack. That training should cover IT and OT systems, simulator scenarios, and the use of in-flight IDSs (Intrusion Detection Systems).
It is now widely acknowledged that ‘trust the instruments’ training may be insufficient, and tragically ineffective, during real-life cyber events. Pilots must be trained and prepared to avoid air disasters caused by a future cyberattack.
