A few minutes ago I saw on BBC News that a Tesla driving in Autopilot mode was involved in an accident and the driver died.
Trying to find some further details, I had a look at the Tesla website and found this post on their blog. It says that the self-driving technology is in beta, that this is stated in the documentation, and that the driver has to keep his/her hands on the steering wheel.
Is this accident a failure of the technology? No, I don’t think it is. A system is designed by humans, and humans are not perfect; therefore we cannot have a perfect system. If an automated system makes fewer mistakes than a human, then it’s already better/safer/more successful. Obviously, we keep working to get the error rate as low as possible.
The real difference between a human and a machine is this: when you make a mistake with your car, a few milliseconds later you know what you did wrong and what you should have done to avoid it. When it’s a machine making the mistake, you think that with your experience and knowledge you might have avoided it. So you regret giving control to the machine.