How is being able to determine which racial/ethnic group a person belongs to from raw medical data wrong in any way?
This is not about equal rights, voting access, or systemic racism. Rather, it's about the health trends that underlie specific groups of people.
Our cultural definitions of race/ethnicity often have little to do with someone's /actual/ genetic ancestry. For example, in the USA, Australian indigenous people, Andamanese islanders, predominantly-European mixed Euro-African Americans, and many others would all, I suspect medically unhelpfully, get labelled as black.
If the AI is using this information (probably informed by biases it's picking up from the training data), then unless those biases are actually correct and justified, we're training a system that will use racial data, potentially inappropriately (see the examples above and in the article), when rendering medical advice or diagnoses. Now that's a problem.
But your solution is the cause of the problem in the first place. An AI doesn't attribute anything that isn't in the data. Feedback from medical outcomes is certainly required; with that in place, the racism would correct itself.
This seems suspect; the article says "the model can still recognise the racial identity of the patient well past the point that the image is just a grey box". Maybe someone here who actually understands deep learning can shed some light on what's happening.
It's difficult to be sure. The original study was posted a few days ago, and my impression is that it was overhyped, even more so than usual. The article claims:
> The paper is extensive, with dozens of experiments replicated at multiple sites around the world.
but it has not been reproduced by independent groups, and IIUC it's just a preprint that has not been accepted by a serious peer-reviewed journal.
About the grey box: they are using a high-pass filter and mapping zero to grey, something like #808080. With a high-pass filter you should still be able to see the borders of the parts of the body, but sometimes those values are small, so to a person the image looks uniformly grey, while a computer can stretch the contrast and still see the structure.
For example, you can go to https://david.li/filtering/ and click the "Edge detection" filter. Now imagine reducing its contrast and using grey instead of black for 0.