Sidelining

This is part 16 of “101 Ways AI Can Go Wrong” - a series exploring the interaction of AI and human endeavor through the lens of the Crossfactors framework.


In the sport of AI automation, what happens when humans get benched?

Sidelining is factor #16 in my series on 101 Ways AI Can Go Wrong.

What is it?

Sidelining happens when humans who are meant to participate in an automated or AI-driven system instead become increasingly, or even entirely, passive.

Why It Matters

Most AI systems include some degree of human oversight or participation, meant to bring human decision-making and judgement into the loop. That supervision may be designed to assess performance at checkpoints, identify deviations from the normal operating range, or act on edge cases. Various factors, however, can lead to sidelining, which renders the human ineffective at these tasks. When that happens, the process may become unreliable or create downstream problems.
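As a rough illustration of that kind of checkpoint design, here is a minimal Python sketch. Everything in it is hypothetical - the thresholds, the classify stand-in for a model, and the ask_human reviewer hook - so treat it as a shape, not an implementation.

```python
from dataclasses import dataclass

# Illustrative values only; real thresholds depend on the domain.
CONFIDENCE_FLOOR = 0.90      # below this, treat the case as an edge case
NORMAL_RANGE = (0.0, 100.0)  # expected operating range for the input signal


@dataclass
class Decision:
    label: str
    confidence: float
    handled_by: str  # "system" or "human"


def classify(signal: float) -> tuple[str, float]:
    """Stand-in for an AI model: returns a label and a confidence score."""
    in_range = NORMAL_RANGE[0] <= signal <= NORMAL_RANGE[1]
    return ("ok", 0.97) if in_range else ("anomaly", 0.60)


def ask_human(signal: float, suggested: str) -> str:
    """The human checkpoint. When the reviewer is sidelined, this
    degenerates into confirming whatever the system proposed."""
    return suggested  # a passive reviewer simply rubber-stamps


def decide(signal: float) -> Decision:
    label, confidence = classify(signal)
    in_range = NORMAL_RANGE[0] <= signal <= NORMAL_RANGE[1]
    # Route low-confidence and out-of-range cases to the human reviewer.
    if confidence < CONFIDENCE_FLOOR or not in_range:
        return Decision(ask_human(signal, label), confidence, handled_by="human")
    return Decision(label, confidence, handled_by="system")
```

The escalation path only works if ask_human represents genuine engagement; nothing in the code can distinguish a thoughtful review from a reflexive click.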

Real-World Examples

The commercial aviation industry has recognized the growing problem of “loss of control” incidents caused by increased automation in the cockpit. In the 2009 crash of Air France Flight 447, investigators found that the aircraft had been operating outside its safe flight envelope without any of the pilots realizing it, a possibility the aircraft system’s designers had not accounted for. Features designed to help pilots under normal circumstances made recovery in the emergency more difficult. The resulting sidelining produced a fatal disconnect between the pilots and the plane’s automated systems.

Key Dimensions

Situational awareness - a human’s ability to perceive, understand, and predict the state of a system based on environmental cues and experience. Sidelining often erodes it entirely.

Keyhole effect - over-reliance on automation narrows human attention to a smaller set of parameters and conditions, contributing to a loss of situational awareness.

Cognitive and sensory re-engagement - a sidelined human may need anywhere from several seconds to many minutes to recover situational awareness through cognitive and sensory re-engagement (a design implication is sketched below).
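One way a design can take that re-engagement latency seriously is to check for recent operator activity before handing over control. The Python sketch below is purely illustrative: the class name, the timeout value, and the response strings are all assumptions, not a prescribed mechanism.

```python
import time

# Illustrative timeout: recovering situational awareness can take anywhere
# from seconds to minutes, so stale engagement is treated as absence.
ACK_TIMEOUT_S = 30.0


class OperatorMonitor:
    """Tracks whether the human in the loop has interacted recently."""

    def __init__(self) -> None:
        self.last_ack = time.monotonic()

    def acknowledge(self) -> None:
        # Called on any deliberate operator interaction with the system.
        self.last_ack = time.monotonic()

    def is_engaged(self) -> bool:
        return time.monotonic() - self.last_ack < ACK_TIMEOUT_S


def handle_anomaly(operator: OperatorMonitor) -> str:
    # Do not hand an emergency to someone who has been passive for minutes;
    # degrade to a safe automated state and escalate instead.
    if operator.is_engaged():
        return "transfer control to operator"
    return "hold safe mode and alert a second responder"
```

The design choice worth noticing is the fallback branch: rather than assuming the human is ready the moment they are needed, the system plans for the case where they are not.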

Take-away

Don’t design a human-in-the-loop system assuming the human will never fall out of the loop.