I watched a great video this morning exploring the moral dilemmas of self-driving cars and the decisions that must be programmed into their software, decisions that determine who lives and who dies if a fatal accident occurs.
Let’s say at some point in the not-so-distant future, you’re barreling down the highway in your self-driving car, and you find yourself boxed in on all sides by other cars. Suddenly, a large, heavy object falls off the truck in front of you. Your car can’t stop in time to avoid the collision, so it needs to make a decision:
1. Go straight and hit the object
2. Swerve left into an SUV
3. Swerve right into a motorcycle
Should it prioritize your safety by hitting the motorcycle? Minimize danger to others by not swerving, even though that means hitting the large object and sacrificing your life? Or take the middle ground by hitting the SUV, which has a high passenger safety rating? What should the self-driving car do?
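To make the tradeoff concrete, here is a minimal, purely illustrative sketch of how such a choice might be encoded as picking the maneuver with the lowest weighted harm. The maneuvers, harm estimates, and weighting scheme are all hypothetical assumptions invented for this example; no real autonomous-vehicle system is being described.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    occupant_harm: float   # hypothetical harm estimate for the car's passengers (0-1)
    external_harm: float   # hypothetical harm estimate for people outside the car (0-1)

def choose_maneuver(options, occupant_weight=0.5):
    """Pick the maneuver with the lowest weighted total harm.

    occupant_weight expresses how strongly the policy favors its own
    occupants over others -- exactly the value judgment the dilemma poses.
    """
    def total_harm(m):
        return occupant_weight * m.occupant_harm + (1 - occupant_weight) * m.external_harm
    return min(options, key=total_harm)

options = [
    Maneuver("go straight into the fallen object", occupant_harm=0.9, external_harm=0.0),
    Maneuver("swerve left into the SUV",           occupant_harm=0.3, external_harm=0.4),
    Maneuver("swerve right into the motorcycle",   occupant_harm=0.1, external_harm=0.9),
]

# Changing occupant_weight changes the "answer" -- the ethics live in that number.
print(choose_maneuver(options, occupant_weight=0.5).name)
```

The point of the sketch is not that the numbers are right, but that someone has to choose them in advance, which is exactly what makes the dilemma unsettling.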