Thread: Ethics and Autonomous Machines

  1. #1
    Senior Member
    Join Date
    Jan 2011
    Posts
    5,724

    Ethics and Autonomous Machines

    I saw an interesting article in Bloomberg about self-driving cars: how should they be programmed to respond to situations where a choice must be made between the occupants’ safety and other vehicles or pedestrians? Do you crash into the tree or veer into the sidewalk full of people?

    Will future buyers demand selfish versus altruistic options? Will insurers shun vehicles that choose the most costly disaster? Will government demand approval authority over the code? Will Asimov’s laws be written into the Uniform Commercial Code?

    When the military begins deploying autonomous drones, smart munitions or a more clever class of mine, will they need to be programmed for some acceptable level of collateral damage or force proportionality as they go about their lethal business? Will machines need to choose when troops in the field must bear more risk to preserve the lives of non-combatants? Will computer intelligence some day be called upon to make some of the terrible decisions military officers have always needed to make?

    Will machine ethics become an important new branch of philosophy?

  2. #2
    Senior Member
    Join Date
    Aug 2016
    Posts
    3,599
    Not Asimov, rather Ray Bradbury, in "The Veldt."

  3. #3
    Senior Member catherine's Avatar
    Join Date
    Jan 2011
    Location
    Vermont
    Posts
    10,235
    Wow. You're making my head explode.

    I'm reminded of utilitarian ethicist Peter Singer's ethics analogy about a train track that divides into two. There is a little boy sitting on one track. All of your financial security is tied up in a vintage car, and that car is sitting on the other track. You are the one who controls which track an oncoming train takes. What do you do?

    Most would say that of course you save the little boy. But Singer argues that in reality we are all, collectively, choosing to save the vintage car (one's own financial security).

    Interesting analogy. Not the same of course as the ethics of a self-driving car, but the car/train connection made me think of it.

    I agree that this is a very interesting and important new "problem" to solve. Maybe the Autodrive will cancel out and put the decision in the driver's hands at the last minute. No self-driving car wants a guilty conscience.
    "Do any human beings ever realize life while they live it--every, every minute?" Emily Webb, Our Town
    www.silententry.wordpress.com

  4. #4
    Senior Member razz's Avatar
    Join Date
    Dec 2010
    Location
    Ontario, Canada
    Posts
    6,040
    People will be people, some struggling to be ethical and some self-interested. It is no different than my experience at the self-checkout at the grocery store. Someone carefully entered all the chosen items on the till, placed them in bags and left without paying.

    I came up to the till and asked a staff member if there was a problem with the machine. It turns out this non-payment is not unusual. Measures added to address the problem include photos of those entering items at the till and new green-light indicators showing when a till is truly free...
    Ethics is a complex field and the applications in the OP indicate a need for a monitoring process. I am curious what that will involve.
    Gandhi: Happiness is when what you think, what you say and what you do are in harmony.

  5. #5
    Senior Member
    Join Date
    Jan 2011
    Posts
    5,724
    Quote Originally Posted by catherine View Post
    Wow. You're making my head explode.

    I'm reminded of utilitarian ethicist Peter Singer's ethics analogy about a train track that divides into two. There is a little boy sitting on one track. All of your financial security is tied up in a vintage car, and that car is sitting on the other track. You are the one who controls which track an oncoming train takes. What do you do?

    Most would say that of course you save the little boy. But Singer argues that in reality we are all, collectively, choosing to save the vintage car (one's own financial security).

    Interesting analogy. Not the same of course as the ethics of a self-driving car, but the car/train connection made me think of it.

    I agree that this is a very interesting and important new "problem" to solve. Maybe the Autodrive will cancel out and put the decision in the driver's hands at the last minute. No self-driving car wants a guilty conscience.
    People try to pass the buck all the time. Why shouldn’t AI?

    The “Trolley Problem” in various forms gets used a lot. It even made an episode of “The Good Place”.

    I always thought Singer’s utilitarianism to be a bit simplistic. He certainly doesn’t live it personally.

  6. #6
    Senior Member Teacher Terry's Avatar
    Join Date
    Dec 2013
    Location
    Nevada
    Posts
    8,820
    I read that self-driving trucks are coming to Texas soon. I also read that if a self-driving car has to choose between hitting a pedestrian and crashing when the occupants are likely to die, the car will hit the pedestrian.

  7. #7
    Senior Member SteveinMN's Avatar
    Join Date
    Mar 2012
    Location
    Saint Paul, Minnesota
    Posts
    5,721
    Quote Originally Posted by catherine View Post
    Maybe the Autodrive will cancel out and put the decision in the driver's hands at the last minute. No self-driving car wants a guilty conscience.
    I cannot see that ending well. Many (most?) drivers are inattentive (especially if the vehicle is driving for them) and prone to overreaction or simply freezing at times of crisis -- and the car is going to hand them a decision that should be made in a matter of seconds (maybe less because the driving system will have more information than the human will)?

    I have long believed that the critical path for autonomous vehicles will not be the technology (it's pretty decent even now); it will be the law -- both codification of (autonomous) behavioral standards and the ability of those "wronged" to sue based on the driving system's decision. Given the market-chilling effects of insurance companies not writing policies on, say, flood-prone properties, the weight insurance companies will carry on the use and behavior of autonomous vehicles will be definitional.
    Success is to be measured not so much by the position that one has reached in life as by the obstacles which he has overcome. - Booker T. Washington

  8. #8
    Senior Member
    Join Date
    Dec 2010
    Posts
    1,463
    Quote Originally Posted by catherine View Post
    Wow. You're making my head explode.

    I'm reminded of utilitarian ethicist Peter Singer's ethics analogy about a train track that divides into two. There is a little boy sitting on one track. All of your financial security is tied up in a vintage car, and that car is sitting on the other track. You are the one who controls which track an oncoming train takes. What do you do?

    Most would say that of course you save the little boy. But Singer argues that in reality we are all, collectively, choosing to save the vintage car (one's own financial security).

    Interesting analogy. Not the same of course as the ethics of a self-driving car, but the car/train connection made me think of it.

    I agree that this is a very interesting and important new "problem" to solve. Maybe the Autodrive will cancel out and put the decision in the driver's hands at the last minute. No self-driving car wants a guilty conscience.
    Make a slight modification and say it is YOUR little boy on one track and YOUR financial security on the other, and YOU are the one at the controls. I think the results change, and people really would save the boy then. I'm not sure that constitutes a true change in ethics, though, but it's something to think about.
    "To give pleasure to a single heart by a single act is better than a thousand heads bowing in prayer." Mahatma Gandhi

    Be nice whenever possible. It's always possible. - Dalai Lama

  9. #9
    Senior Member jp1's Avatar
    Join Date
    Dec 2010
    Location
    San Francisco
    Posts
    5,144
    I think Steve has it right. Currently humans drive cars, and when they crash, they (or their insurers) pay for the damage caused to other people or property. With an autonomous vehicle, where the actions of the vehicle are dictated by software, it will be the vehicle's or the programmer's responsibility. Once autonomous vehicles are the norm, people won't need auto insurance anymore.

    I see insurance submissions for this type of risk a couple of times a month. I decline to quote them all, not because I think the companies developing this don't know what they are doing, but because I don't know what the law will eventually say about who is responsible. Just as it is still being worked out when and where Uber is responsible for accidents in various scenarios (did they have a passenger or not, etc.), the same is the case for self-driving vehicles.

    Self-driving vehicles will likely be far safer than human-driven vehicles, but they'll never be as safe as planes. Kids will still run out from between cars to chase balls, or whatever. Whether the car steers away from the kid but has a head-on collision with another car, or decides to mow down the kid, is a decision someone will have to make before that decision has to be made.
