What role do stories have in technology? And how do we design technology with ethics in mind? We sat down with human-computer interaction and ethics researcher Minha Lee of Eindhoven University of Technology in the Netherlands to talk about the quickly evolving relationship between humans and machines—with an emphasis on how machines can be used to stoke our humanity and build more dynamic emotional lives. We talked with Lee about how artificial agents can interact positively with, and strengthen, the human world.
While this might seem obvious, not enough care goes into designing artificial agents that can reveal an emotional inner world we can believe in. Authenticity of emotions drives all narratives, be it our self-narrative or the narratives of real or imagined beings. Phoebe Sengers, a researcher at Cornell Tech, wrote about how artificial agents need narrative intelligence because “in a narrative, what actually happens matters less than what the actors feel or think about what has happened. Fundamentally, people want to know not just what happened but why it happened.”
A way to think about technologies that enter our lives in very intimate ways (like Alexa listening to you sleep) is to ask at what point someone or something must be attributed a moral trait like trustworthiness before we are okay with being vulnerable. Socially driven concepts like trust require interaction. One cannot simply attribute trustworthiness to another without the other having earned it in some way. Technology has a chance to become this trustworthy agent through interactions. Humanized attributes of a thing, be it trustworthiness or compassion, have no basis if interactions do not exist.
Going forward, what will be interesting is dissecting how we don't moralize technology in the same way [as we moralize humans]. For example, people don't blame technology, but they are more willing to punish it. There is a responsibility gap when we don't really know whom to blame. When too many people are accountable, nobody is accountable. It's easier for people, emotionally and cognitively, to find one entity to place blame on, but because technology cannot understand blame, perhaps punishment is the way blame gets distributed. That's how you take ownership of a negative emotion you might have. In these ways, I am curious about how our social and moral rituals will change through technology.
- Minima Moralia: Reflections from Damaged Life by Theodor W. Adorno (2005).
- People May Punish, But Not Blame Robots by Minha Lee, Peter Ruijten, Lily Frank, Yvonne de Kort, and Wijnand IJsselsteijn (2021).
- It’s only a computer: Virtual humans increase willingness to disclose by Gale Lucas, Jonathan Gratch, Aisha King, and Louis-Philippe Morency (2014).
- Racial Influence on Automated Perceptions of Emotions by Lauren Rhue (2018).
- How We've Taught Algorithms to See Identity: Constructing Race and Gender in Image Databases for Facial Analysis by Morgan Klaus Scheuerman, Kandrea Wade, Caitlin Lustig, and Jed R. Brubaker (2020).
- Narrative Intelligence by Phoebe Sengers. In K. Dautenhahn (Ed.), Human Cognition and Social Agent Technology (2000).
- The Sciences of the Artificial by Herbert Simon (2019).
- Life on the Screen by Sherry Turkle (1995).
- Reclaiming Conversation: The Power of Talk in a Digital Age by Sherry Turkle (2015).
- Caring for Vincent: A Chatbot for Self-compassion by Minha Lee, Sander Ackermans, Nena van As, Hanwen Chang, Enzo Lucas, and Wijnand IJsselsteijn (2019).
- The Real World of Technology by Ursula Franklin (1989).
- Psychopolitics: Neoliberalism and New Technologies of Power by Byung-Chul Han (2017).
- Duty Free Art: Art in the Age of Planetary Civil War by Hito Steyerl (2017).
- The Intellectual Challenge of CSCW: The Gap Between Social Requirements and Technical Feasibility by Mark S. Ackerman (2000).
In collaboration with
- Minha Lee, PhD
- Megan Harris
- Sarah Lansky