The AI black box

Discrimination by algorithm: scientists devise test to detect AI bias

There was the voice recognition software that struggled to understand women, the crime prediction algorithm that targeted black neighbourhoods and the online ad platform that was more likely to show men highly paid executive jobs.

Concerns have been growing about AI’s so-called “white guy problem”, and now scientists have devised a way to test whether an algorithm is introducing gender or racial biases into decision-making.

Moritz Hardt, a senior research scientist at Google and a co-author of the paper, said: “Decisions based on machine learning can be both incredibly useful and have a profound impact on our lives … Despite the need, a vetted methodology in machine learning for preventing this kind of discrimination based on sensitive attributes has been lacking.”
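As a rough illustration of what such a test looks at (a minimal sketch, not the paper's exact procedure): one common check in this line of work compares a classifier's true positive rate across demographic groups, flagging a disparity when the gap is large. The group labels, toy data and helper names below are hypothetical, introduced only for this example.

```python
def true_positive_rate(y_true, y_pred):
    # Fraction of genuinely positive cases the model accepted.
    positives = [p for t, p in zip(y_true, y_pred) if t == 1]
    return sum(positives) / len(positives) if positives else 0.0

def equal_opportunity_gap(y_true, y_pred, groups):
    # Compute the true positive rate separately for each group,
    # then return the largest gap between any two groups.
    rates = {}
    for g in set(groups):
        yt = [t for t, gg in zip(y_true, groups) if gg == g]
        yp = [p for p, gg in zip(y_pred, groups) if gg == g]
        rates[g] = true_positive_rate(yt, yp)
    return max(rates.values()) - min(rates.values()), rates

# Toy data: qualified applicants (y_true=1) and model decisions,
# split across two hypothetical groups "a" and "b".
y_true = [1, 1, 1, 1, 1, 1, 0, 0]
y_pred = [1, 1, 1, 0, 1, 0, 0, 1]
groups = ["a", "a", "a", "a", "b", "b", "a", "b"]

gap, rates = equal_opportunity_gap(y_true, y_pred, groups)
print(rates, gap)  # group "a" is accepted at 0.75, group "b" at 0.5
```

Here qualified members of group "b" are accepted at a lower rate than those of group "a", the kind of disparity such a test is designed to surface.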