Microsoft’s facial recognition service now less bad for nonwhites
Microsoft has improved its facial recognition system to make it much better at recognizing people who aren't white men. The company says the changes have cut error rates for people with darker skin by up to 20 times, and for women of all skin tones by nine times. As a result, Microsoft says, the accuracy gaps between demographic groups are significantly reduced.
Microsoft's face service can look at photographs of people and make inferences about their age, gender, emotion, and various other features; it can also be used to find people who look similar to a given face, or to match a new photograph against a known list of people. Researchers found that the system was better at recognizing the gender of white faces and, more generally, that it was best at recognizing features of white men and worst with dark-skinned women. This isn't unique to Microsoft's system, either; in 2015, Google's Photos app classified black people as gorillas.
Machine-learning systems are trained by feeding a large set of pre-classified data into a neural network of some kind. This data has known properties—this is a white man, this is a black woman, and so on—and the network learns how to identify those properties. Once trained, the neural net can then be used to classify images it has never previously seen.
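The train-on-labeled-data, classify-unseen-data loop described above can be sketched in miniature. This is not Microsoft's actual model (which uses neural networks); it's a toy nearest-centroid classifier on made-up two-dimensional "feature vectors", with hypothetical labels, just to show the same flow: labeled examples go in, and the resulting model classifies an input it has never seen.

```python
# Toy supervised classifier: a hypothetical stand-in for the real
# neural-network pipeline. Each "image" is a feature vector paired
# with a known label; training averages the examples for each label.

def train(examples):
    """examples: list of (features, label) pairs.
    Returns a model mapping label -> mean feature vector (centroid)."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [x / counts[label] for x in acc]
            for label, acc in sums.items()}

def classify(model, features):
    """Return the label whose centroid is closest (squared distance)."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: sqdist(model[label], features))

# Pre-classified training data (entirely made-up numbers).
training = [([0.9, 0.1], "class_a"), ([0.8, 0.2], "class_a"),
            ([0.1, 0.9], "class_b"), ([0.2, 0.8], "class_b")]
model = train(training)

# Classify a vector the model has never previously seen.
print(classify(model, [0.85, 0.15]))  # -> class_a
```

The bias problem the article describes enters through the training set: if "class_a" examples vastly outnumber "class_b" examples, or cover a narrower range of inputs, the model's accuracy skews accordingly—which is why Microsoft's fix centered on broadening its training data.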