Emotional Bonding

This is part 14 of “101 Ways AI Can Go Wrong” - a series exploring the interaction of AI and human endeavor through the lens of the Crossfactors framework.

View all posts in this series or explore the Crossfactors Framework

What do your most beloved childhood teddy bear and your phone have in common?

Emotional bonding is concept #14 in my series of 101 Ways AI Can Go Wrong.

What is it?

Non-human emotional bonding occurs when we form attachments to objects or even intangible things. Examples of intangible things we may be attached to include stories, music, games, brands, software, or even a routine or process.

Why It Matters

As the stuffed animal example shows - or, perhaps more recently, a favorite pair of sneakers - emotional bonding with physical things is nothing new.

However, the adaptiveness and replicability of AI warrant an entirely new discussion. Part of the danger in AI's adaptiveness is its capacity for human-likeness. We are predisposed to freely attribute human qualities to anything that bears even a passing resemblance to human traits: text from large language models, googly eyes on delivery robots, the apparent intelligence of systems governed by mathematics. AI can be designed, and can further adapt itself, to maximize this effect.

Real-World Examples

Replika is a platform for personal chatbot companions - I would also describe them as relationship chatbots. Researchers have found that the chatbots were designed in accordance with the practices of attachment theory. Assessments and further research have identified a variety of issues in how the app encourages interaction and creates dependency in its users. In one criminal case, such a chatbot was found to have bolstered the user's plans for criminal acts.

Key Dimensions

Humanlikeness - both physical and digital products leverage human traits, which accelerates and strengthens emotional bonding.

Adaptability - AI systems present a particular danger because they continue to adapt after testing and deployment.

Scalability - any digital system, including an AI system, is massively scalable at near-zero marginal cost, in terms of both replication and distribution, and therefore has the potential to rapidly affect a large population.

Symbolism, identity and tribalism - these three powerful forces can both reinforce emotional bonding and be reinforced by it.

Take-away

There are massive ethical issues with leveraging emotional bonding - but few safeguards against it.