
Title: Differences in Trust Between Human and Automated Decision Aids
Publication Type: Conference Paper
Year of Publication: 2016
Authors: Pearson, Carl J., Welk, Allaire K., Boettcher, William A., Mayer, Roger C., Streck, Sean, Simons-Rudolph, Joseph M., Mayhorn, Christopher B.
Conference Name: Proceedings of the Symposium and Bootcamp on the Science of Security
Date Published: April 2016
Conference Location: Pittsburgh, Pennsylvania
ISBN Number: 978-1-4503-4277-3
Keywords: Automation, decision-making, reliance, risk, strain, trust, Warning of Phishing Attacks, Supporting Human Information Processing, Identifying Phishing Deception Indicators, and Reducing Vulnerability, workload

Humans can easily find themselves in high-cost situations where they must choose between suggestions made by an automated decision aid and a conflicting human decision aid. Previous research indicates that humans often rely on automation or on other humans, but not on both simultaneously. Expanding on work by Lyons and Stokes (2012), the current experiment measured how trust in automated and human decision aids differs with perceived risk and workload. The simulated task required 126 participants to choose the safest route for a military convoy; they were presented with conflicting information from an automated tool and a human. Results demonstrated that as workload increased, trust in automation decreased, and as perceived risk increased, trust in the human decision aid increased. Individual differences in dispositional trust correlated with increased trust in both decision aids. These findings can inform training programs for operators who receive information from both human and automated sources, in contexts such as air traffic control, aviation, and signals intelligence.

Citation Key: Pearson:2016:DTH:2898375.2898385
Refereed Designation: Refereed
