> roads and driving laws are all built around human visual processing.
And people die all the time.
> The recent example of a power outage in SF where lidar powered Waymo’s all stopped working when the traffic lights were out and Tesla self driving continued operating normally makes a good case for the approach.
Huh? Waymo is liable for injuries, so all their cars called home at the same time and effectively DoS'd themselves rather than risk killing someone.
Tesla takes no responsibility and does nothing.
I can't see the logic that connects vision-only to the lights being out. At all.
Yes... but people can only focus on one thing at a time. We don't have 360° vision. We have blind spots! We don't even know the exact speed of our car without glancing away from the road momentarily! Vision-based cars obviously don't have these issues. Just because some cars are 100% vision doesn't mean they have to share all of the faults we have when driving.
That's not me arguing in favour of one vs the other. I'm ambivalent and don't actually care. They can clearly both work.
They do, but the rate is extremely low compared to the volume of drivers.
In 2024 in the US there were about 240 million licensed drivers and an estimated 39,345 fatalities, which is about 0.016% of licensed drivers. Every single fatality is awful, but the complement of that number means that 99.984% of drivers were relatively safe in 2024.
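For anyone who wants to check that arithmetic, here's a rough back-of-envelope sketch, using only the figures quoted above:

```python
# Back-of-envelope check of the fatality-rate arithmetic (2024 US figures as quoted above)
licensed_drivers = 240_000_000   # ~240 million licensed drivers
fatalities = 39_345              # estimated traffic fatalities

rate = fatalities / licensed_drivers
print(f"fatalities as a share of licensed drivers: {rate:.3%}")  # ~0.016%
print(f"complement: {1 - rate:.3%}")                             # ~99.984%
```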
Tesla has published statistics on the improvement from its safety features compared to the general driving population (https://www.tesla.com/fsd/safety), and the numbers are pretty dramatic.
Miles driven before a major collision:
699,000 - US Average
972,000 - Tesla average (no safety features enabled)
2.3 million - Tesla (active safety features, manually driven)
5.1 million - Tesla FSD (supervised)
It's taking something that's already relatively safe and making it approximately 5-7 times safer using visual processing alone.
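To show where the "approximately 5-7 times" comes from, here's a quick sketch dividing the FSD figure by the others (figures are just the ones quoted above; nothing is implied about how Tesla collected them):

```python
# Rough ratios behind the "5-7 times safer" claim, using the quoted miles-per-major-collision figures
miles_per_major_collision = {
    "US average": 699_000,
    "Tesla, no safety features": 972_000,
    "Tesla, active safety features (manual)": 2_300_000,
    "Tesla FSD (supervised)": 5_100_000,
}

fsd = miles_per_major_collision["Tesla FSD (supervised)"]
for label, miles in miles_per_major_collision.items():
    print(f"FSD vs {label}: {fsd / miles:.1f}x the miles per major collision")
# FSD vs US average: ~7.3x; vs Tesla with no safety features: ~5.2x
```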
Maybe lidar can make it even better, but there's every reason to tout the success of what's in place so far.
No, you're making the mistake of taking Tesla's stats as comparable, which they are not.
Comparing "the subset of driving on only the roads where FSD is available, active, and hasn't turned itself off because of weather, road, traffic, or any other conditions" versus "all drivers, all vehicles, all roads, all weather, all traffic, all conditions"?
Or the accident stats that don't count as an accident any collision without airbag deployment, regardless of injuries? Including accidents serious enough that the airbags could not deploy?
The stats on the site break it down into major and minor collisions; see the link above.
I have no doubt that there are ways to take issue with the stats. I'm sure we could look at accidents from 11pm - 6am compared to the volume of drivers on the road as well.