
I think the scary part comes when we have to define the "max risk" threshold for the self-driving car.

No matter how good a self-driving car is at estimating risk, estimating is all it can do: it cannot know beforehand whether an accident will happen. So the question becomes: if the car estimates a 0.001% risk of an accident at 40 mph on a certain stretch of road, and a 0.01% risk at 60 mph, how fast should it drive?
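
To make the question concrete, here is a minimal sketch of that decision rule, assuming a hypothetical risk estimator and using the made-up numbers above: the car simply picks the fastest speed whose estimated risk stays under whatever "max risk" threshold someone chose.

```python
# Minimal sketch of a "max risk" speed choice. The risk estimator, speeds,
# and thresholds are hypothetical illustrations, not a real system.

def estimated_accident_risk(speed_mph: float) -> float:
    """Hypothetical per-trip accident risk estimate for this stretch of road."""
    # 0.001% at 40 mph, 0.01% at 60 mph (the numbers from the comment above)
    return {40: 0.00001, 60: 0.0001}.get(speed_mph, 1.0)

def choose_speed(candidate_speeds, max_risk):
    """Pick the fastest candidate speed whose estimated risk stays under the threshold."""
    allowed = [s for s in candidate_speeds if estimated_accident_risk(s) <= max_risk]
    return max(allowed) if allowed else min(candidate_speeds)

# A 0.005% threshold keeps the car at 40 mph; raising it to 0.01% makes the
# same car choose 60 mph. The entire behavior hinges on who sets max_risk.
print(choose_speed([40, 60], max_risk=0.00005))  # -> 40
print(choose_speed([40, 60], max_risk=0.0001))   # -> 60
```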

Or do we just make the person who sets the risk tolerance of the self-driving car liable for any accidents that might happen, and let them set whichever risk tolerance they prefer?


