Hacker News

Genuine question: is Tesla’s autopilot crashing more often or more severely than human drivers?


To some degree, that doesn't matter. An underlying feature of a competent approach to safety in design is that the design must take maximal ownership of eliminating risk to all people in all scenarios that can be reasonably expected to result from the design.

The moment Tesla set expectations by proclaiming it as Autopilot, they took on the corresponding responsibility to make sure it did not generate any unsafe scenarios. The moment they implemented features that allowed drivers' attention to drift more than in standard driving, they also took responsibility for making sure that drifting attention did not place the system in an unsafe state.

This same issue applies to touch-screen interfaces in modern cars. Drivers could always stare down at their radio when there were tactile knobs and dials, but touch-screen interfaces now require it because they've eliminated tactile feedback. Telling drivers 'just don't look down' misses the point, because it's the responsibility of the car manufacturer not to create a system where that added safety risk goes uncontrolled.


Pretty much this. They could have called it Super Cruise Control or something and I'm pretty sure nobody would have anything to say, because it's expected that cruise control be supervised. But I think people wouldn't have been as willing to pay a lot of money for a feature that didn't sound so remarkable.


Self-driving technology did seem on track to reach human parity within five years back in the 2010s, but the growth was later revealed to be logarithmic rather than exponential, and Elon doubled down on a bad bet on it.

It’s not about whether they should have clarified the scope; the scope did include completely automatic driving. It’s just that they failed to deliver (to be fair, no one truly has).


I really don’t see how anything could matter more than “does it save lives, on balance”. If it saves thousands of lives annually, then why would we let tenuous marketing grievances forestall its deployment? How many lives should we sacrifice over branding concerns? Of course, if the technology doesn’t save lives on balance, then that’s reason enough to restrict deployment, but in any case marketing issues don’t seem like they should factor into the calculus.


There was a post recently with a video of a Tesla attempting to drive into the path of an oncoming train: https://youtube.com/watch?v=yxX4tDkSc_g

It’s not the big mistake at the end that stands out to me but the sheer volume of mistakes it makes along the way: edging forward at an intersection when there’s a red light, for example.

I don’t doubt that self driving tech will improve and be a safer alternative to a human driver eventually. It doesn’t seem like we’re there yet though.


Yeah, I fully expect Autopilot to have different failure modes than human drivers, but what I’m interested in is the difference in fatality rates (deaths per hundred million miles, adjusted for road type, i.e., highway vs. city streets). If Autopilot can save hundreds of lives annually from human-error mistakes like falling asleep at the wheel, but at the cost of one life annually due to obscure failure modes like driving toward a train, I maintain that we should not only allow Autopilot, but probably even mandate it on new vehicles. Sacrificing hundreds or thousands of lives annually because we don’t like the specific failure modes seems absurd. Of course, if it doesn’t save lives, then we should block its deployment on those grounds (but the particular kind of failure mode shouldn’t affect the calculus).
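
The comparison described above can be sketched in a few lines. All figures below are hypothetical placeholders (not real Tesla or NHTSA statistics); the point is only the shape of the calculation: compute deaths per hundred million miles separately per road type, so highway-heavy Autopilot mileage isn't compared against an all-roads human average.

```python
def fatality_rate(deaths: int, miles: int) -> float:
    """Deaths per 100 million vehicle miles traveled."""
    return deaths / miles * 100_000_000

# Hypothetical (deaths, miles) figures, broken out by road type.
human = {"highway": (100, 50_000_000_000), "city": (300, 100_000_000_000)}
autopilot = {"highway": (1, 1_000_000_000), "city": (2, 500_000_000)}

for road in human:
    h = fatality_rate(*human[road])
    a = fatality_rate(*autopilot[road])
    print(f"{road}: human {h:.2f} vs autopilot {a:.2f} per 100M miles")
```

With these made-up numbers, Autopilot looks safer on highways (0.10 vs 0.20) but worse in the city (0.40 vs 0.30), which is exactly why the per-road-type adjustment matters.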


Tesla publishes a Safety Report regarding this.

https://www.tesla.com/VehicleSafetyReport

Many, if not all, publicized "Autopilot suspected" Tesla crashes were later found to have happened because the driver accelerated too hard, forgot the brake pedal exists, and lost control.

That is less likely with Autopilot, as it can't go faster than 90 mph.



