Researchers are working on a new program that they believe can detect criminality just by analyzing facial features.
The line ‘I can tell just by looking at you, you’re no good’ may have sprung to mind from some old comedy sketch show; well, that may now be the case.
Two Chinese researchers have taken this line of work seriously, authoring a machine learning paper that examines how a computer might be able to tell who is a criminal and who is not.
Xiaolin Wu and Xi Zhang, researchers at Shanghai Jiao Tong University, posted their paper, titled ‘Automated Inference on Criminality using Face Images,’ on arXiv.
But really? Can algorithms do any better at an exercise in which the face itself is examined to infer criminality?
Katyanna Quach from The Register said: “It’s true that machines don’t have emotions or conscience to be considered subjective, but that doesn’t mean data can’t be biased.”
Ben Sullivan from Motherboard pointed out that the researchers maintained that the data sets were controlled for race, gender, age and facial expressions.
The authors stated: “We are the first to study automated face-induced inference on criminality free of any biases of subjective judgments of human observers. By extensive experiments and vigorous cross validations, we have demonstrated that via supervised machine learning, data-driven face classifiers are able to make reliable inference on criminality.”
Sullivan wrote more about the question of bias: “Wu told Motherboard that human bias didn’t come into it. ‘In fact, we got our first batch of results a year ago. We went through very rigorous checking of our data sets, and also ran many tests searching for counterexamples but failed to find any,’ said Wu.”
How they tested: Wu and Zhang fed facial images of 1,856 people into a machine learning algorithm. Nearly half were convicted criminals.
Motherboard said they used standard ID photographs, not mugshots, of Chinese males between the ages of 18 and 55. The men did not have facial hair, and the researchers said: “We stress that the criminal face images in Sc are normal ID photos not police mugshots.”
MIT Technology Review picked up on their methods and commented: “They then used 90% of these images to train a convolutional neural network to recognize the difference and then tested the neural net on the remaining 10% of the images.”
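The 90/10 train/test procedure described above can be sketched in a few lines. This is only an illustrative sketch with placeholder data: the authors actually trained a convolutional neural network on real ID photos, whereas here the dataset, labels, and helper names are assumptions made purely to show how such a split and its accuracy bookkeeping work.

```python
import random

def train_test_split(samples, train_fraction=0.9, seed=0):
    """Shuffle samples and split them into train/test sets by the given fraction."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

def accuracy(predictions, labels):
    """Fraction of predictions that match the true labels."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

# Placeholder dataset standing in for the 1,856 ID photos,
# roughly half labeled 1 ("criminal") and half 0 ("non-criminal").
dataset = [(i, i % 2) for i in range(1856)]
train, test = train_test_split(dataset)

# 90% of 1,856 images go to training, the remaining 10% to testing.
assert len(train) == 1670
assert len(test) == 186
```

In the paper's actual setup, the classifier is trained only on the 90% partition, and the reported accuracy comes from its predictions on the held-out 10%, which it never saw during training.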
The researchers said that the classifiers performed consistently well and produced evidence for the validity of automated face-induced inference on criminality.
MIT Technology Review, discussing their findings in ‘Emerging Technology from the arXiv,’ said the pair found that the neural network could correctly identify criminals and noncriminals with an accuracy of 89.5%.
Looking at their findings, Quach said: “It’s bad news for those who have smaller mouths, curvier upper lips and closer-set eyes, as you look more like a crook, apparently. On average, criminals have a 19.6% smaller nose-mouth angle, a larger upper lip curvature at 23.4%, and a 5.6% shorter distance between the inner corners of the eyes.”
If you still think the very idea of looking at facial features to determine anything of the sort is a bit of a stretch, you are not alone. In 2016, the paper’s focus troubled some people, who would prefer to avoid any concept that suggests using physical features to determine criminality.
Writing in The Intercept, Sam Biddle said: “No computer or software is created in a vacuum. Software is designed by people, and people who set out to infer criminality from facial features are not free from inherent bias.”
MIT Technology Review, looking ahead, said: “Of course, this work needs to be set on a much stronger footing. It needs to be reproduced with different ages, sexes, ethnicities, and so on, and on much larger data sets.”
The report added: “All this heralds a new era of anthropometry, criminal or otherwise, and there is room for more research as machines become more capable.”
More information: TechXplore.