Recent research on the use of automated systems has indicated the presence of automation bias, a term describing errors made when decision makers rely on automated cues as a heuristic replacement for vigilant information seeking and processing. This tendency can result in automation commission errors, i.e., errors made when decision makers take inappropriate action because they over-attend to automated information or directives, and automation omission errors, i.e., errors made when decision makers fail to take appropriate action because automated aids do not inform them of an imminent problem or situation. In a series of studies, participants who perceived themselves as "accountable" for their strategies of interaction with automation were significantly more likely to verify its correct functioning and committed significantly fewer automation-related errors than those who did not report this perception. In this study, we examine two manipulations intended to encourage verification behaviors in cockpit crews. The first, an information/training manipulation, will provide explicit information on automation bias and training in verifying automated functioning. The second will consist of display prompts to verify automated functioning. Participants (glass-cockpit crews) will fly a series of approaches on the Mini-ACFS, a two-person part-task flight simulator. Approach scenarios include several automation "events," that is, glitches, malfunctions, or inappropriate recommendations that can be caught if cross-checked against other cockpit indicators. We anticipate that these manipulations will mitigate automation bias. Final data analysis will be completed prior to the OSU symposium and is expected to support the hypotheses that: a) reduced errors are associated with verification behaviors; b) crews will be more likely to verify under the training/display conditions than in the control condition; and c) effects will persist when subjects return for a follow-up exercise 6-9 weeks after their first session. Implications for training and automation design will be discussed.
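
As a purely illustrative aside (not drawn from the paper), the two error types defined above could be expressed as a small classification rule over simulated crew responses; the function name, event labels, and simplified boolean event model below are all hypothetical.

    # Hypothetical sketch only: tallying the two automation-bias error types
    # defined in the abstract. The event model is invented for illustration.

    def classify_response(event_type: str, crew_acted: bool, crew_verified: bool) -> str:
        """Classify a crew response to an automation "event".

        event_type: "false_directive" (automation recommends an inappropriate action)
                    or "missed_alert" (automation fails to flag a real problem).
        crew_acted: whether the crew took the action in question.
        crew_verified: whether the crew cross-checked other cockpit indicators.
        """
        if event_type == "false_directive" and crew_acted and not crew_verified:
            return "commission error"    # acted on an inappropriate automated directive
        if event_type == "missed_alert" and not crew_acted and not crew_verified:
            return "omission error"      # missed a problem the automation did not flag
        return "no automation-related error"

    # Example: an unverified false directive that the crew followed.
    print(classify_response("false_directive", crew_acted=True, crew_verified=False))
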


    Title: Automation Bias and Countermeasures in Flight Crews

    Contributors:

    Conference: International Symposium on Aviation Psychology ; 1997 ; Columbus, OH, United States

    Publication date: 1997-01-01

    Type of media: Conference paper

    Type of material: No indication

    Language: English




    Similar titles:

    The formation process of flight crews

    Ginnett, Robert C. | NTRS | 1987



    Human Factors Knowledge Requirements for Flight Crews

    Kantowitz, B. H. / International Civil Aviation Organization | British Library Conference Proceedings | 1993


    Understanding and Counteracting Fatigue in Flight Crews

    Mallis, Melissa / Neri, David / Rosekind, Mark et al. | NTRS | 2007