The aviation industry is currently experiencing an unprecedented push toward automating flight control technologies to enhance survivability by reducing pilot workload and augmenting decision-quality information. Pilots have long suffered from excessive fatigue caused by long hours and staffing shortages (recently exacerbated by the COVID-19 pandemic), leading to preventable mistakes and many close calls [1]. Compounding pilot fatigue is a reduction in average pilot experience: COVID-19 layoffs and mandatory retirements removed experienced pilots from the workforce, while the training and licensing of new pilots failed to keep pace with attrition, decreasing the average experience of pilot pairs in the cockpit [2]. Pilot mental workload also increases during emergency situations, which reduces pilots' ability to perform concurrent tasks in the cockpit [3]. Heightened workload coupled with critical emergencies, such as loss of power or insufficient thrust, further elevates pilot stress [4]. Consequently, pilots' reactions to overload and stress during emergencies vary, with instances of startle, freezing (dissociation), and denial evident in some cases [5]. In response to these conditions, avionics manufacturers are enhancing their products with complex automated features that assist pilots during emergencies, thereby decreasing pilot workload and mitigating the hazards associated with mixed and inconsistent reactions.

This paper scrutinizes the impact of decision support tools and highlights the delicate balance between automation-assisted decision making and the preservation of human judgement and situational awareness. Every decision is made on the basis of currently available information, and the quality of those decisions increases when more information is assembled and more appropriately framed. Automation may solve the immediate need for consistency and efficiency, but it can also lead to unwanted behaviors when the operator becomes complacent and less vigilant through over-reliance on the automation, allowing learned skills to languish through neglect. This burgeoning issue persists because automation is becoming ever easier to employ, which seems to eclipse the industry's obligation to establish a compelling need for automation in the first place.

Safety, consistency, improved economy, and reliability are all reasons why automation is highly sought after in the pursuit of reducing pilot workload. While these goals are achievable, there are lingering questions about the appropriate depth and breadth of automation within the industry, particularly in the critical process of enabling emergency aircraft landing systems. If there is a compelling need for automation in these situations, it raises questions about the impact on pilot proficiency, the potential for over-reliance on technology, and the preservation of crucial piloting skills. In emergency situations, automation is a double-edged sword: it can assist overwhelmed pilots by taking over some tasks, but it can also lead to complacency and a decline in pilot skills. Relying too heavily on automation can leave pilots unable to respond effectively when systems fail or perform unexpected actions, a phenomenon known as de-skilling. The development of automated systems should therefore consider the human condition of vigilance and how it degrades over time, often resulting in decision fatigue.
Investigating the levels of automation and balancing them with a human-centered design approach will not only facilitate increased safety assurance but also enhance flight control and the information exchange between the pilot and the aircraft. To pique interest within the industry and elicit timely discussion, this paper postulates remedies to alleviate the concerns of fully automated emergency landing flight systems by introducing enhanced human-centric automation, and it provides recommendations for the development of future automated flight systems. The discussion underscores the dire need for targeted intervention and for fostering a symbiotic relationship between humans and automation.
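To make the notion of levels of automation concrete, the following is a minimal, hypothetical Python sketch, not drawn from the paper or from any fielded avionics system, of how an emergency landing assistant might grade its authority: advising only, requiring pilot consent, acting within a veto window, or acting fully autonomously. The names (AutomationLevel, LandingOption, decide_action) and the composite risk score are illustrative assumptions intended only to show how pilot judgement can be preserved at the lower levels.

# Hypothetical illustration of graded automation authority for an
# emergency landing assistant; all names and metrics are assumptions,
# not the system described in this paper.
from dataclasses import dataclass
from enum import IntEnum
from typing import Optional


class AutomationLevel(IntEnum):
    """Coarse levels of automation, loosely inspired by classic taxonomies."""
    ADVISE_ONLY = 1       # system suggests options; pilot decides and acts
    CONSENT_REQUIRED = 2  # system selects an action but waits for pilot approval
    VETO_WINDOW = 3       # system acts unless the pilot vetoes in time
    FULLY_AUTONOMOUS = 4  # system acts and informs the pilot afterwards


@dataclass
class LandingOption:
    runway: str
    distance_nm: float
    risk_score: float     # lower is safer (hypothetical composite metric)


def select_best_option(options: list[LandingOption]) -> LandingOption:
    """Rank candidate landing sites by the composite risk score."""
    return min(options, key=lambda o: o.risk_score)


def decide_action(level: AutomationLevel,
                  options: list[LandingOption],
                  pilot_approved: Optional[bool]) -> str:
    """Return a decision that respects the configured automation level.

    pilot_approved is None when the pilot has not yet responded, which
    matters for the consent-required and veto-window levels.
    """
    best = select_best_option(options)
    if level == AutomationLevel.ADVISE_ONLY:
        return f"Advise pilot: recommend {best.runway} (risk {best.risk_score:.2f})"
    if level == AutomationLevel.CONSENT_REQUIRED:
        if pilot_approved:
            return f"Execute approach to {best.runway} after pilot consent"
        return f"Hold: awaiting pilot consent for {best.runway}"
    if level == AutomationLevel.VETO_WINDOW:
        if pilot_approved is False:
            return "Abort automated approach: pilot veto received"
        return f"Begin approach to {best.runway}; pilot may still veto"
    return f"Autonomously execute approach to {best.runway}; notify pilot"


if __name__ == "__main__":
    candidates = [
        LandingOption("KABC 27L", 12.4, 0.31),
        LandingOption("KXYZ 09", 8.9, 0.47),
    ]
    # Show how the same emergency is handled at each automation level
    # when the pilot has not yet responded.
    for lvl in AutomationLevel:
        print(lvl.name, "->", decide_action(lvl, candidates, pilot_approved=None))

In this sketch the lower levels keep the pilot as the final authority, while the higher levels trade that authority for speed and consistency, which is precisely the balance the paper argues must be examined before enabling fully automated emergency landing systems.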
Reducing Workload: A Double-Edged Sword