Like some kind of ridiculous goody-two-shoes who brags about acing their driving test and never once getting a traffic ticket, autonomous cars are programmed to absolutely always follow the law. Again, this means that one day they will be a lot better than people, because they won’t be so distracted by rocking out to Whitesnake that they blow through a stop sign.
But for now, while they share the roads with cars that still have drivers in them, it means they can’t react to ambiguity. NO ONE follows traffic laws all the time. How many times have you sped up to make a red light, or merged into traffic that was going above the speed limit? You have to be able to balance what is legal with what is safe in the moment. Driverless cars won’t be able to make decisions like that, yet they are surrounded by people who do, and that mismatch will lead to accidents. In fact, all those crashes in the studies mentioned in the first point involved some form of human error – usually a regular car hitting an autonomous car – because the human driver expected the self-driving car to act the way a person would in that situation, not the way a computer would.