HughPickens.com writes: What should a driverless car with one rider do if it is faced with the choice of swerving off the road into a tree or hitting a crowd of 10 pedestrians? The answer depends on whether you are the rider in the car or someone else is, writes Peter Dizikes at MIT News. According to recent research, most people prefer that autonomous vehicles minimize casualties in situations of extreme danger -- except for the vehicles they would be riding in. "Most people want to live in a world where cars will minimize casualties," says Iyad Rahwan. "But everybody wants their own car to protect them at all costs." The result is what the researchers call a "social dilemma," in which people could end up making conditions less safe for everyone by acting in their own self-interest. "If everybody does that, then we would end up in a tragedy whereby the cars will not minimize casualties," says Rahwan. The researchers conducted six surveys, using the online Mechanical Turk public-opinion tool, between June 2015 and November 2015. The results consistently showed that people take a utilitarian approach to the ethics of autonomous vehicles, one emphasizing the sheer number of lives that could be saved. For instance, 76 percent of respondents believe it is more moral for an autonomous vehicle, should such a circumstance arise, to sacrifice one passenger rather than 10 pedestrians. But the surveys also revealed a lack of enthusiasm for buying or using a driverless car programmed to avoid pedestrians at the expense of its own passengers. "This is a challenge that should be on the mind of carmakers and regulators alike," the researchers write. "For the time being, there seems to be no easy way to design algorithms that would reconcile moral values and personal self-interest."