Mercedes solves car AI morality problem
By Ajith Ram October 19, 2016
- Car AI will prioritise driver over pedestrians
- Laws will vary for AI cars across countries
FROM the start of the self-driving car hype, there has been a morality question: in a situation where an inevitable accident will kill either the driver, pedestrians or other drivers, whom will the car's AI choose to save?
Car manufacturer Mercedes-Benz has decided that if it comes to a choice between drivers and pedestrians, its cars will always choose to save the driver. Obviously, this does not guarantee that the driver will be saved, but the car's AI will at least know whose life to prioritise.
Christoph von Hugo, Mercedes’s manager of driver assistance systems, told Car and Driver magazine: “You could sacrifice the car. You could, but then the people you’ve saved initially, you don’t know what happens to them after that in situations that are often very complex, so you save the ones you know you can save. If you know you can save at least one person, at least save that one. Save the one in the car.”
He is of the opinion that this is not a major issue, as an AI-controlled car will aim to avoid crashing in the first place.
“This moral question of whom to save: 99 percent of our engineering work is to prevent these situations from happening at all. We are working so our cars don’t drive into situations where that could happen and will drive away from potential situations where those decisions have to be made,” he said.
It could also be that Mercedes has figured out that a car which prioritises pedestrians is not going to be very popular with its potential customers.
Mercedes is not the only one defining a solution to this complex ethical problem. The German government is also about to define the laws governing self-driving cars.
In an interview last month, the country's transport minister said three rules would govern self-driving cars:
1) Property damage should always be prioritised over personal injury
2) Pedestrians should not be classified based on parameters like age and size
3) If something goes wrong, the car manufacturer will be held responsible
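Taken together with the Mercedes stance, these priorities could be encoded as an ordering over possible outcomes. The sketch below is purely illustrative; the `Outcome` fields and ranking are assumptions for the example, not anything published by Mercedes or the German ministry.

```python
# Hypothetical sketch of a collision-priority policy. All names here are
# invented for illustration; no manufacturer has published such code.
from dataclasses import dataclass

@dataclass
class Outcome:
    harms_person: bool     # any personal injury expected
    harms_occupant: bool   # injury to someone inside the car
    property_damage: float # estimated cost of material damage
    # Deliberately no age/size fields: per the German proposal (rule 2),
    # pedestrians are not classified by such parameters.

def rank(outcome: Outcome) -> tuple:
    """Lower tuples are preferred. Encodes: avoid personal injury over
    property damage (rule 1), and, per the Mercedes stance, protect the
    occupant before those outside, then minimise property damage."""
    return (outcome.harms_person,
            outcome.harms_occupant,
            outcome.property_damage)

def choose(outcomes: list[Outcome]) -> Outcome:
    # Pick the least-bad outcome under the ordering above.
    return min(outcomes, key=rank)
```

For example, a swerve that only dents the car ranks ahead of any manoeuvre that injures a person, because `False` sorts before `True` in the first tuple position.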
Although they might sound vaguely familiar, the German minister's three rules are quite different from Isaac Asimov's famous Three Laws of Robotics, used in films such as Will Smith's I, Robot:
1) A robot may not injure a human being or, through inaction, allow a human being to come to harm
2) A robot must obey orders given it by human beings except where such orders would conflict with the First Law
3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law
Self-driving cars also pose questions for the driver. For instance, is it legal for the driver to sleep while the AI is driving the car?
Or to watch a movie, as seems to have happened in the recent fatal Tesla crash?
In the UK, an insurance company recently unveiled a policy for self-driving cars that completely prohibits the driver from dozing off while the AI is in control. This is in keeping with current UK law, which requires the driver's complete attention at all times; even reading a book is not an option.
As usual, the Americans disagree with the British on the laws governing self-driving cars. Jumping straight into the future, Michigan has recently passed laws which even allow cars without steering wheels. Many of the other states are expected to follow suit.
While the key pieces of self-driving technology will fall into place soon, it is likely to be years before all the ethical questions are fully answered.