A friend sent me a link to a New Yorker piece (link below) that pointed out that the self-driving cars Google is developing will sometimes have to make "moral" decisions. The author, Gary Marcus, provides this example: "Your car is speeding along a bridge at fifty miles per hour when an errant school bus carrying forty innocent children crosses its path." Should you swerve, with the expectation that your car will fly off the bridge and you will die, or simply slam on the brakes with the expectation that you will hit the bus fast enough to kill many children (you being protected by your airbag)?
Marcus points out that the computers that control cars will have to make such judgment calls in a split second. My concern is: how should they do it? In particular, whose perspective should they take on?
One perspective is that of you, the driver. It seems to me that you are not required to turn your car if you expect to die as a result. It's not your fault that the bus cut in front of you, and I'll suppose that going 50 mph is within the speed limit. It would be heroic of you to sacrifice yourself for the children, but it's beyond the call of duty. I will suppose that you would not do it.
The other perspective is that of a neutral party (of course there's also the perspective of the children and their loved ones, but it's hard to see why the computer would take their perspective). I think it would be permissible, and that there would be positive moral reason, for someone with the power to flip a switch to cause your car to swerve off to the side in order to save some number of children. You and your car constitute an innocent threat to the children, but I think innocent threats can be killed to save a greater number of innocent victims. I will suppose that a neutral party would divert your car, thereby killing you, to save them.
Should your car take your perspective, as though it is your agent? Or should it take the neutral perspective, as we would presumably want state-installed machines to be programmed if they could intervene in such situations? I can see reasons on both sides, but I'd love to hear what those of you reading this post think.
Thanks. Alec
It should plow forward solely on pragmatic grounds — otherwise nobody would ever buy one.
Here is some layman input.
It should take the driver's perspective. I am convinced that self-driving cars would be involved in fewer accidents. If people were suspicious of self-driving cars, fewer would be sold, resulting in a higher aggregate accident rate (and thereby more deaths).
The New Yorker piece imagines a future in which all cars (and buses) are self-driving and communicate with each other. That scenario makes it much less likely that a bus could cut in front of you in a way that "startles" the car. It's harder to come up with examples there: perhaps a buggy bus, or a sudden gust of wind that throws the bus off balance and forces it to compensate. But then again, the bus's program should (ideally) know about the weather in advance and compensate accordingly.
And how would the car know that there are lots of children on the bus? Did the bus tell the car, "Move over, I've got 40 innocent children!"? How could we trust this information? It could very well be false; the bus might really be the escape vehicle for a murderous bank robber.
Yes, the idea that they should use the driver’s point of view so that drivers will get them is a good reason to go with the driver’s point of view.
But now suppose they were required, like seat belts or air bags. Suppose you could no longer buy a person-driven car, or that you could turn off the autopilot only on specially designated "human-error-prone" roads.
And suppose that buses and cars do indeed signal how many people are on board, and trucks signal what sorts of dangerous chemicals are on board, so the computers have the information they need to make the decisions.
Now should we use the neutral point of view for them? Note that, for each individual, the odds of surviving such incidents go up that way. But still, your car might not do what you want it to.
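The point that each individual's survival odds improve under the neutral policy is an expected-value claim, and a toy veil-of-ignorance calculation makes it concrete. All the casualty numbers below are illustrative assumptions, not figures from the post:

```python
# Toy veil-of-ignorance sketch: if you don't know in advance whether you'll be
# the driver or one of the children, which policy gives you better survival odds?
# The numbers are assumptions chosen for illustration only.

seats = 41                    # 1 driver + 40 children involved in the incident
deaths_if_car_brakes = 10     # assume the collision kills 10 of the children
deaths_if_car_swerves = 1     # only the driver dies going off the bridge

# Suppose you are equally likely to be any of the 41 people involved.
p_die_driver_policy = deaths_if_car_brakes / seats    # car protects its driver
p_die_neutral_policy = deaths_if_car_swerves / seats  # car minimizes total deaths

# Under these assumptions the neutral policy is better for everyone ex ante,
# even though it is worse for whoever turns out to be the driver.
assert p_die_neutral_policy < p_die_driver_policy
```

Of course, the comparison only comes out this way because the crash is assumed to kill more people than the swerve; the ex ante argument is exactly as strong as that assumption.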