Can AI Be Our Neighbor?

Beatrice Institute, one of the sponsoring institutes of Genealogies of Modernity, produced this exciting podcast episode. This interview with Noreen Herzfeld asks: in making computers to solve ethical dilemmas and robots to enter relationships, are we creating something in our own image? Is it possible to separate intelligence or emotion from the body? Would the result live up to its promise, or simply be monstrous?

Listen on Apple, Spotify, or Amazon.

Noreen Herzfeld, who teaches both computer science and theology, has spent a lot of time reflecting on these issues. She and Gretchen discuss the many questions that arise from that contemplation. Why is it so important to us to seek other forms of sentience—whether robots, pets, or even alien life? If AI fulfills the role of other persons in our lives, can it become our “neighbor”? How does the way we treat and think about AI impact our relationships with other humans, for better or for worse?

  • “What I loved about mathematics and logic in particular was its cleanness [...] And yet, it was precisely in that human messiness where the most interesting questions lay.”

  • Why do we want to make computers in our own image, when what they do best as tools are precisely the things we can’t do very well ourselves?

  • “Is the image we're trying to give to the computer the same as the image we think we reflect from God?”

  • Asking whether AI is our neighbor, or whether it should have human rights, is “wishful thinking.”

  • We project human intelligence and motivation on dogs in much the same way we want to do with AI.

  • The “real” AI isn’t robots and game-playing programs, but algorithms that influence what we see and try to get us to buy things.

  • As AI moves into more parts of our lives, we need to ask whether it will prevent us from spending time with other humans.

  • Living in community is not meant to be frictionless, but to “wear the rough spots off of you.”

  • “If we devise robotic companions who are always cheerful, are always telling us what we want to hear … this isn't the way a neighborhood should be.”

  • While some argue that autonomous weapons would reduce war crimes by cutting out human emotion, ethical considerations might end up as secondary to the desire to win.

  • “Reason by itself is wrong. As we try to make computers in our image, I fear that we will change ourselves to be more like them.”

  • “If there was a system of ethics that made human society work, we'd have found it by now; but we haven't, because there isn't. It's not rule bound, the same way that human intelligence is not: it doesn't work like a computer program.”

  • “In some ways, when we think AI will solve our problems for us, we're abdicating responsibility for solving our own problems.”

  • Human intelligence is embodied, not separate from our physical existence.

  • A body is necessary to experience emotion; therefore computers, which can’t experience emotion, can never fulfill relational needs.

  • How we treat robots and AI matters, because the way we treat things shapes who we are as people, whether in virtue or in vice.

    Links

    “The Excesses of God”

    When machines become our friends, will it mean we’ve become less human? 

    Can Lethal Autonomous Weapons Be Just?

    The Ethical Case for Killer Robots

    Vivo Developing Smartphone with Detachable Drone Selfie Camera

    The Artifice of Intelligence 

    Inside the First Church of Artificial Intelligence

    Can you murder a robot?
