
Evaluating technical capabilities in autonomy based on YouTube demo videos is like evaluating a country's military potential by watching its army parades on holidays.

The truth is, none of these companies is anywhere close to full autonomy. Full-autonomy _testing_ will begin only once they deploy cars without backup drivers, to actually see how badly things can go without watchful human supervision.



Waymo has had <100 situations requiring human intervention over the tens of millions of miles they have driven so far. They are probably already safer than human drivers and would likely cause fewer accidents. The issue there is a legal one: who is going to be responsible for a crash? The company that produced the driving software/hardware, or the owner of the car? Or even the passengers, in the case of city-wide car pools one could call when needed?


No, they had plenty more: approximately 100 cases over 600k miles last year (their own data, reported in California) that would have led to a dangerous situation, and 9 cases which by their own admission would have resulted in a collision. It is hard to compare crashes of different magnitudes: people often get into fender benders, but on average a fatal accident happens only once per 100 million miles driven, and who knows how many of those 9 cases would have ended in a fatality. Nevertheless, this is only 600k miles, not 100M. Stop spreading misinformation.
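
A quick back-of-the-envelope in Python, using only the figures above (the once-per-100M-miles human fatality rate is the rough average this comment cites; treat it as an illustrative sketch, not an analysis):

    # Rough rate comparison using only the numbers cited above.
    waymo_miles = 600_000               # miles from last year's California report
    would_be_collisions = 9             # cases Waymo admits would have crashed
    fatal_interval_miles = 100_000_000  # ~1 fatal accident per 100M human-driven miles

    miles_per_collision = waymo_miles / would_be_collisions
    print(f"One would-be collision every ~{miles_per_collision:,.0f} miles")
    # -> One would-be collision every ~66,667 miles

    # The sample is far too small to estimate a fatality rate at all:
    coverage = waymo_miles / fatal_interval_miles
    print(f"600k miles covers only {coverage:.1%} of one expected fatal-accident interval")
    # -> 600k miles covers only 0.6% of one expected fatal-accident interval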


Alright, I might have misremembered; the last time I read about it, I was surprised by how low those numbers were. Maybe I need to look at it again.



