Can we teach robots ethics?
We are not used to the idea of machines making ethical decisions, but the day when they will routinely do this - by themselves - is fast approaching. So how, asks the BBC's David Edmonds, will we teach them to do the right thing?
The car arrives at your home bang on schedule at 8am to take you to work. You climb into the back seat and remove your electronic reading device from your briefcase to scan the news. There has never been trouble on the journey before: there's usually little congestion. But today something unusual and terrible occurs: two children, wrestling playfully on a grassy bank, roll on to the road in front of you. There's no time to brake. But if the car swerved to the left, it would hit an oncoming motorbike.
Neither outcome is good, but which is least bad?
The year is 2027, and there's something else you should know. The car has no driver.
I'm in the passenger seat and Dr Amy Rimmer is sitting behind the steering wheel.
Amy pushes a button on a screen, and, without her touching any more controls, the car drives us smoothly down a road, stopping at a traffic light, before signalling, turning a sharp left, navigating a roundabout and pulling gently into a lay-by.
The journey's nerve-jangling for about five minutes. After that, it already seems humdrum. Amy, a 29-year-old with a Cambridge University PhD, is the lead engineer on the Jaguar Land Rover autonomous car. She is responsible for what the car's sensors see, and for how the car then responds.
She says that this car, or something similar, will be on our roads in a decade.
Many technical issues still need to be overcome. But one obstacle for the driverless car - one that may delay its appearance - is not merely mechanical or electronic, but moral.
[...]
http://www.bbc.com/news/magazine-41504285