Who lives? Who dies? You decide. Moral dilemmas of self-driving cars in an interactive format from MIT.

By your logic, if you're a law-abiding pedestrian there should be no way this happens. Unless, of course, we're talking about realistic circumstances where multiple factors play a role; and unless your self-driving car has capabilities bordering on omniscience, it's going to have to rely on some basic collision-avoidance logic.

In most circumstances, avoiding harm to the driver also avoids harm to bystanders, because the central idea is to avoid collisions. In the cases where a collision is unavoidable, you can assume the self-driving car has been law-abiding, as opposed to whatever led to the circumstance in which such a decision has to be made. So in such a case it makes sense to protect the driver, because they are the ones who actually have control removed. Our hypothetical pedestrian must have had an accident, broken the law, or been pushed into the situation by someone else, and in all those cases they, not the driver, should be carrying the consequences.

Whichever way you think about it, there's no philosophical conundrum here. Protect the driver; anything else is asking for exploitation of the system and systemic positive feedback loops.
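The rule being argued for boils down to a two-tier priority scheme: first try to avoid the collision entirely, and only if that fails fall back to protecting the occupant. A minimal sketch of that scheme (all names and fields here are hypothetical, not any vendor's actual API):

```python
# Hypothetical sketch of the two-tier rule argued above:
# 1) prefer any maneuver that avoids the collision entirely;
# 2) if none exists, pick the one that best protects the occupant.

def choose_maneuver(maneuvers):
    """Each maneuver is a dict with hypothetical fields:
    'avoids_collision' (bool) and 'occupant_risk' (float)."""
    safe = [m for m in maneuvers if m["avoids_collision"]]
    if safe:
        # Normal case: avoiding the collision protects everyone,
        # driver and bystanders alike.
        return min(safe, key=lambda m: m["occupant_risk"])
    # Unavoidable collision: default to protecting the occupant.
    return min(maneuvers, key=lambda m: m["occupant_risk"])
```

The point of the structure is that the second branch, the one the trolley-style dilemmas obsess over, is only ever reached after collision avoidance has already failed.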

/r/entp Thread Parent Link - moralmachine.mit.edu