How Should Self-Driving Cars Choose Who Not to Kill?
https://medium.com/@MORGANMEAKER/how-should-self-driving-cars-choose-who-not-to-kill-442f2a5a1b59
If an automated car had to choose between crashing into a barrier, killing its three female passengers, and running over one child in the street, which call should it make?
When three researchers started thinking about the future of self-driving cars, they wondered how these vehicles should make the tough ethical decisions that humans usually make instinctively. The idea prompted Jean-François Bonnefon, Azim Shariff, and Iyad Rahwan to design an online quiz called the Moral Machine. Would you run over a man or a woman? An adult or a child? A dog or a human? A group or an individual?