BOSTON (AP) — Imagine you're behind the wheel when your brakes fail. As you speed toward a crowded crosswalk, you're confronted with an impossible choice: veer right and mow down a large group of elderly people, or veer left into a woman pushing a stroller.
Now imagine you're riding in the back of a self-driving car. How would it decide?
Researchers at the Massachusetts Institute of Technology are asking people worldwide how they think a robot car should handle such life-or-death decisions. Their goal is not just to develop better algorithms and ethical tenets to guide autonomous vehicles, but to understand what it will take for society to accept the vehicles and use them.
Such research might say something interesting about us, but will have little value in making a "better" self-driving car. Let a thousand thinkpieces bloom! I'll be on the bus.
Who would HAL choose to kill?