Originally Posted by
John2
Interestingly, self-driving cars aren't "programmed" in the traditional sense; they use something called machine learning. Much like humans "learn" to drive, so do the machines. This doesn't produce code that humans can read; it produces huge sets of learned numerical parameters that humans can't really interpret, so we can only observe the resulting behavior.
It's what makes all these AI generators quite fascinating: we've created something that even the top developers can't fully understand.
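To illustrate the point in the quote with a hypothetical toy example (nothing like a real self-driving stack): even the simplest "learned" model ends up as opaque numbers rather than readable rules. Here we fit the hidden rule y = 2x + 1 by gradient descent, and the resulting "program" is just two floating-point values.

```python
# Toy illustration: a "learned" model is just numbers, not readable code.
# Hypothetical example; a real driving system has millions of such parameters.

def train(data, lr=0.1, epochs=500):
    """Fit y = w*x + b to (x, y) pairs with stochastic gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y  # prediction error on this example
            w -= lr * err * x      # gradient step for the weight
            b -= lr * err          # gradient step for the bias
    return w, b

# Training data sampled from the hidden rule y = 2x + 1
data = [(x, 2 * x + 1) for x in range(-2, 3)]
w, b = train(data)

# Everything the model "knows" is contained in these two numbers;
# scale that up to millions of weights and nobody can "read" it.
print(f"learned parameters: w={w:.3f}, b={b:.3f}")
```

The learned w and b converge toward 2 and 1, but nothing in the stored parameters says "multiply by two and add one" in any human-readable way; that opacity is exactly what the quoted post describes.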
Self-driving cars are probably already much safer than the average human driver, but as a society we will be very cautious about introducing them. As you point out, it's the unexpected events the systems haven't been trained on that will cause the most delay.
During the transition, cars will support both self-driving and human driving, and as a result insurance will remain the responsibility of the car owner.
The idea of owning a car will probably become old-fashioned quite quickly; the future of car transport will be Uber-like services where a car shows up promptly when you need it and the company handles the servicing and insurance.
It will be a long, slow transition, but even if they are imperfect, self-driving cars will undoubtedly end up far, far safer than human drivers.
If you still want to drive in the future, the car will probably intervene if it looks like you're about to crash or exceed the speed limit. Realistically, we'll probably see a rise in recreational tracks where people can go and drive "like in the old days".
Eventually it will probably be illegal for a human to drive without these safeguards. One day a human driver will accidentally run over a child, and the public outcry will be that there is no reason for unnecessary deaths when we have the technology to prevent them.