Self-driving or autonomous vehicles look like they’re here to stay. We will probably see more and more of them in the future, and we may soon be sharing our roads with them in California. The notion of a computer program driving us around town is intriguing, but it also raises serious moral questions.
Imagine you’re driving through Manhattan Beach one afternoon, and three jaywalkers jump in your way. If you’re fast enough, you can swerve to avoid hitting them, but the only space you can steer into is occupied by an elderly woman walking her dog.
What do you do? Do you save the lives of three unlawful jaywalkers, or do you save the life of one elderly woman and her dog? Because most drivers operate on instinct in these moments, they would not have time to weigh such a decision. Their reflexes and reaction speed would probably dictate whether the car struck the woman and her dog or the three jaywalkers.
Autonomous cars could be preprogrammed to make such decisions
A morality debate quickly ensues when you consider self-driving cars and how these vehicles should handle a situation like this. Perhaps the autonomous car’s logic center would assess the situation and determine that the greater good, saving more human lives, justifies hitting the woman and her dog. Alternatively, perhaps the car would decide to strike the jaywalkers, since the elderly woman is the one who is following the law and walking where she is supposed to be.
When engineers program autonomous cars, they will need to instruct the vehicles what to do in situations like these. After all, a computer-driven vehicle can think and react fast enough to follow a specific course of action in a circumstance like the one described above.
Perhaps it’s not such a big problem after all
If you think about how many people die each year because of negligent, inattentive, distracted, drunk or unlawful motorists, autonomous vehicles could do an incredible amount of good, potentially saving tens of thousands, if not hundreds of thousands, of human lives every year.
Many analysts believe that the benefit of saving this many lives outweighs the downside of a rare forced-choice scenario occurring.
Nevertheless, the question remains: if an autonomous car causes an accident, who is at fault and liable for the crash? Would it be the owner of the self-driving car, the creator of the self-driving software, or the vehicle designer and manufacturer? It may take time before personal injury law catches up to these new questions.
Would you ride in or own an autonomous car?
It certainly appears that autonomous cars are the way of the future, but experts say we are 25 to 40 years away from seeing them fill our streets. In the meantime, drivers are encouraged to drive defensively, follow the law and do everything possible to avoid a serious car crash.