
> In this incident, the driver was using autopilot in a fashion it should not be used: a twisting road at high speed.

I don't understand why, if the car's GPS co-ordinates are known, the car even allows autopilot on anything but roads known to be within spec.

> The driver IS at fault.

I'm not sure legal liability is that clear cut.



I can't imagine that every road in the world can be determined to be "safe" or "not safe" via GPS coordinates alone.


Then the classification would fail safe (i.e. no autopilot for you), and that's good[1]. The alternative with the current technology is apparently depending on humans to decide (poorly). The point is to minimise deaths caused by engaging autopilot at the wrong time while you gather more data; a rough sketch of such a default-deny check is below.

[1] If the goal is to stay out of court. I understand the "AI drivers are better on average" argument.
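
To make the fail-safe idea concrete, here's a minimal sketch in Python. Everything in it is a made-up placeholder (the whitelist, the segment lookup, the speed limits), not any vendor's real data or API; the only point is that anything the map can't vouch for defaults to "no autopilot for you".

    from typing import Optional

    # Hypothetical whitelist: segment_id -> max speed (km/h) at which
    # autopilot is allowed. Anything not listed here is denied.
    APPROVED_SEGMENTS = {
        "I-80_exit_12_to_14": 120,
        "CA-1_mile_30_to_35": 80,
    }

    def segment_for(lat: float, lon: float) -> Optional[str]:
        """Map GPS coordinates to a road segment id; None if unmapped."""
        # A real system would do a map-matching lookup here; this stub
        # just stands in for "we could not classify this road".
        return None

    def may_engage_autopilot(lat: float, lon: float, speed_kmh: float) -> bool:
        segment = segment_for(lat, lon)
        if segment is None:
            return False              # unknown road -> fail safe, deny
        limit = APPROVED_SEGMENTS.get(segment)
        if limit is None:
            return False              # not whitelisted -> deny
        return speed_kmh <= limit     # approved road, within spec

The whole argument lives in the defaults: deny when the classification is missing, and widen the whitelist as more data comes in.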


Certainly the software making the driving decision is capable of 'knowing'



