Let’s talk about a classic thought experiment in ethics, introduced by philosopher Philippa Foot in 1967, called "the trolley problem."
Here’s the problem: a runaway trolley is barreling down the tracks, on a course to hit five people who are tied to the track. You can pull a lever to make the trolley switch tracks. However, doing so will put it on a course to kill one other person who is tied up on the second track. What do you do?
It’s an interesting moral dilemma that has been debated for decades. And now, thanks to Google, the trolley problem is getting a modern makeover. Let’s call it the "Google car problem."
Here’s the revised problem, courtesy of Popular Science magazine:
A front tire blows, and your autonomous SUV swerves. But rather than veering left, into the opposing lane of traffic, the robotic vehicle steers right. Brakes engage, the system tries to correct itself, but there's too much momentum. Like a cornball stunt in a bad action movie, you are over the cliff, in free fall.
Your robot, the one you paid good money for, has chosen to kill you.
It’s not just a question for future car owners. Most futurists predict autonomous tractors will be common on tomorrow’s farms. How will OEMs respond to situations like this? Should they install a manual override that lets the operator take control of the brakes, for instance?
In the coming years, as autonomous tractors move from speculative buzzword to practical tool on the farm, it will be interesting to see what other safety measures OEMs install to avoid ethical quandaries like the trolley problem.