This is part 15 of “101 Ways AI Can Go Wrong” - a series exploring the interaction of AI and human endeavor through the lens of the Crossfactors framework.
View all posts in this series or explore the Crossfactors Framework
Vomit fraud is an example of concept #15 in my series on 101 Hidden AI Traps - emergent behavior.
What is it?
Let’s define this carefully for systems. Behavior is the way something works or functions. Agency or intent is not a requirement. A machine can behave erratically. Emergent behavior is behavior which is arising or newly forming, especially if it is unexpected. It is often described as behavior which doesn’t depend on a system’s individual parts but emerges from their relationship.
Why It Matters
Emergent behaviors are usually unpredictable and may evolve or change (also unpredictably) over time, requiring constant reassessment. They may differ in different scenarios or when different humans are involved. The time scale at which they are seen may be from seconds to decades. Unintended consequences are common and the original goal of the system may be rendered unviable.
Real-World Examples
A common type of emergent behavior is when a human figures out a way to game a digital system. An interesting example is Uber drivers falsely accusing riders of having caused huge messes in their vehicle so that they could collect larger sums of money meant to reimburse them for cleaning fees, known as vomit fraud.
Many types of emergent behavior are also observed when AI systems created to play computer games, where they are shown to find loopholes or inconsistencies in the rules, or exploit technical glitches.
Key Dimensions
Here are five features that underlie the emergence of new behavior.
Radical novelty - each additional level of complexity in a system has the potential to introduce new properties and new emergent behaviors.
Coherence - when a system with emergent behavior is stable in a positive, perhaps even intended, way.
Dynamism - the evolution of emergent behavior, particularly due to feedback loops.
Downward causation - when the emergent behavior of a system in turn shapes the behavior of the parts (including humans).
Take-away
As AI systems continue to become more integrated and more opaque, emergent behavior will be nearly guaranteed. Who is responsible to proactively account for it?