
>You are assumed to be a criminal because a machine said so.

Not really. At least where I live, the false positive rates are high enough that people would walk right past, or employees would wave them through because they're holding store-branded bags. A more egregious example would be the algorithmic bail systems that some states are deploying.



Bail has been algorithmic for many years, based on prior offenses and the seriousness of the crime.

The new horror is machine learning systems that make decisions based on hidden, discriminatory factors.
