The Inflection Point

Jul 13, 2021
by Caroline Miller
Derek Chauvin, Facebook, George Floyd, Kazuo Ishiguro, Klara And The Sun, Megan O'Gieblyn, robot/human interface, robots

Kazuo Ishiguro’s latest novel, Klara And The Sun, speculates on how a robot might learn to adapt to humans. The question he poses is interesting, but we humans should spend time thinking about how we ought to adapt to them. First, we should consider what we want from robots and what disruptions they might impose.

At the moment, technological advances come from innovators with dollar signs in their eyes. The results aren’t always a plus for the public. For example, look at the way Facebook has eroded personal privacy.

Despite the problems that company has caused, Facebook is working on a new idea. Programmers are designing wearable devices that allow our thoughts to interface with machines. A user no longer types a word to see it on the screen but thinks it. (“Does a Robot Get to Be the Boss of Me?” by Megan O’Gieblyn, Wired, 28.86, pg. 28-29.)

For those with limited use of their hands, the benefit is unquestionable. But can we trust Facebook to anticipate the downside to this advance?

One parameter we should consider is to what degree a human/robot interface is desirable. Take policing, for example. We use machines to good effect at times: when we need to detonate a bomb or must crawl into dangerous places. But should we set limits on their use?

Consider the Derek Chauvin case. We know a police officer kept his knee on a man’s neck despite the victim’s gasps that he couldn’t breathe. Ignoring those pleas, Chauvin maintained his position for nine minutes and twenty-one seconds. By the time he rose, George Floyd, the man on the ground, was dead.

What shocked so many who saw the killing was Chauvin’s robotic behavior.  For nine minutes and twenty-one seconds, he felt a man dying beneath his knee and showed no mercy.

Presumably, he was doing his duty. Many killers have defended their actions with a similar argument.  They were soldiers at Auschwitz or Mỹ Lai. They served Andrew Jackson on the Trail of Tears.  They were servants of God during the Inquisition. Or, they were patriots who employed torture in a time of war. No matter the gloss, most hearts recoil at their actions. 

What we must fear is that one day the human species may employ robots to do the unthinkable. Given the right algorithm, robots can inflict pain without remorse and commit immeasurable atrocities that, because their work is impersonal, will spare their masters pangs of guilt, just as war pilots are spared when they drop bombs on sleeping villages.

Why punish Chauvin for performing as some future robot might?

We hold Chauvin accountable because he forces us to confront human nature’s dark side. We expect a robot to be devoid of feeling.  We expect a human to have a scintilla of compassion. We hope that when a savage impulse overtakes us, we will reach an inflection point, a moment of grace when compassion steps upon the stage to wrestle with our demon.

If conscience fails to summon that brighter angel, we become less than robots. We become monsters.

As we build our brave new world, let us hope those better angels guide our judgments. To question, to doubt, to feel is the essence of being human. Unpredictability is our hallowed gift. 

Before I created a soldier or a policeman out of bolts and wires, I’d consider the task to be served and to what degree predictability was necessary. The decision requires an omniscience at which even God has often failed. Throughout our history, people have walked among us behaving as if they had an inflection point, only to reveal an inhuman face in times of upheaval. Until we understand the difference between them and us, I am in no hurry to create robots like ourselves.

2 Comments
  1. Pam G, July 13, 2021 at 1:05 pm
    Thank you for this fine, thought-provoking article, Caroline! Your description of robot insensitivity really gets to the heart of something I hold dear, which is that we can NEVER use the ends to justify the means, because we have no way to know the outcome(s) in advance--of military action; individual violence; mineral extraction from the sea bed . . . whatever. Yes, we can predict sometimes, with all our biases informing our choices and actions. And now let's talk about operator error . . . !
    • Caroline Miller, July 14, 2021 at 12:11 pm
      Operator error? Now you get to the nub of the problem. Frailty, thy name is Homo sapiens.
