Who should driverless cars kill? [Interactive]

I'm sorry, but you are an utter psychopath. This isn't an insult; I genuinely believe you are a dangerous person with a god complex and no understanding of how morality works. So you're saying that I deserve to die because I have a car? The question is very simple: if I had a regular car, would my chances of survival in this situation be better or worse than in an autonomous car? If it's the former, then it's my moral obligation to choose the regular car.

An autonomous car can make a better decision than you in a split second, as it can weigh more options in the 0.2 s it takes you to respond to external stimuli. Your decision is, understandably, that of self-preservation. However, if the choice is between you and "a dozen" pedestrians, then clearly the better option is for them to survive, and not you.

This is clearly a case where the law should dictate boundaries, because egotistical people like you can't handle the responsibility. The responsibility to kill yourself? Please, point to a single law that says anything as deranged as that

The responsibility to let the good of many prevail over the good of one.

You're also conveniently glossing over the other point I made. Imagine YOU being the pedestrian; would you want to "run into a car" that always puts its driver first? You're conveniently glossing over the point I'm making: EVERY CAR RIGHT NOW PUTS THE DRIVER FIRST, because the driver puts the driver first, and no rational person would have it any other way.

Again, you're ignoring my point, which highlights the glaring shortcomings in your argument. PUT YOURSELF IN THE PLACE OF THE PEDESTRIAN, and THEN tell me you still want a car to run down 12 pedestrians if it comes to that. How about now, Mr. Suicidal? Mr. Psychopath?

Every car right now does NOT put the driver first: it's the driver who impulsively puts him/herself first. If you assume that a self-driving car outperforms you as a driver (and soon, it will), then it's also time to put some limits on the driver.

If it is able to take morality into account, however, then it should do so --yet another point you carefully avoid-- and minimize the loss of human lives. Except that you can't program morality. Who do you think you are to imply that my life is worth less than the pedestrian's? Or a dozen of them?

Who are you to think that your life is worth more than that of a dozen others? What delusions of grandeur are you suffering from?

You're clearly not getting anywhere. I'll let you have the last word, but so as not to be tempted to wipe the floor with your arguments once again, wasting more of my time, I won't read it. But feel free to think that I will.

/r/dataisbeautiful Thread Parent Link - moralmachine.mit.edu