Designing For Care with Social Robots

A Conversation with Thomas Arnold

Join us on Zoom this Thursday, October 26, at 2:30 p.m. EDT for the next conversation in our AI-themed series, hosted with AI and Faith and the UU Soul Matters Sharing Circle.

Thomas Arnold, a Visiting Scholar of Technology Ethics at Tufts University and a research associate in the university’s Human-Robot Interaction Lab, will be our guest. Driving the conversation will be Rev. Kathy Tew Rickey, minister of the Unitarian Universalist Fellowship of Boca Raton.

Here’s the link: https://us02web.zoom.us/j/89740235512

Arnold focuses on moral and social norms in human-robot interaction. His current projects include the design of an eldercare robot. Without getting too technical, we’ll explore ethical challenges in robot design and the social and emotional implications of human-robot interactions.

With academic roots in philosophy, classics, and religious studies, Arnold has transitioned smoothly into the technological sphere, advocating for an ethical framework that encompasses the multi-faceted human experience. To get a sense of the breadth of his thinking, see this recent conversation published by AI and Faith.

Here are a few excerpts that strike us as worth exploring:

  • “I’m interested in how people, organizations, and communities can hold AI systems to account.”

  • “Facing an AI system at the Go board is a different thing than watching a robot be at the bedside of your loved one in hospice care.”

  • “Human-robot interaction will force us to ask what or who is really ‘present’ at all in a given interaction. When I shout at an automated voice service, I’m not talking to an animal or a human being. I’m talking to no one, and I may not even be aware of who I am or want to be.”

  • “It’s important that people ask in what way they are embodying the ‘human’ end of the bargain as they use and relate to technology. That may be a harder thing to face than how to categorize a machine.”

  • “After an episode of ‘Westworld’ or ‘Black Mirror’ you might have some ideas of what you would do or feel interacting with a robot, but real interactions prove to be another story. You can swear up and down that a robot is a tool, or a mere device … but when it simulates crying you might still instinctively offer it comforting words.”

We’ll publish an edited transcript of the conversation and the full Zoom recording. Our interview with Robert Geraci, Professor of Religious Studies at Manhattan College, was the first conversation in our series.

Dan Forbush

Publisher developing new properties in citizen journalism.

http://smartacus.com