Cops Upload Image of Suspect Generated From DNA, Then Delete It After Mass Criticism

An image of a suspect, created from a sample of his DNA? It sounds like dystopian sci-fi, but experts say it’s even worse — it’s tech that cops are already rolling out, even though it almost certainly doesn’t work.

Alberta, Canada’s Edmonton Police Service (EPS) took to Twitter on Tuesday to share an image created using DNA phenotyping, which the department claims can be used as a helpful tool to predict (not actually identify, just predict) what a criminal might look like.

The only problem? The tech is nowhere near mature: it’s unlikely to generate an accurate portrait, and it can’t even guess at key physical details ranging from age to facial hair.

The EPS alluded to those limitations in its press release, but decided to go ahead and release an image of the computer-generated suspect regardless. One key and loaded detail? The AI-generated suspect was Black — and experts were quick to raise the alarm.

“Geneticist here,” tweeted Dr. Adam Rutherford, a genetics lecturer at University College London, in response to the announcement. “You can’t make facial profiles or accurate pigmentation predictions from DNA, and this is dangerous snake oil.”

Rutherford is almost certainly right. The EPS purchased the tech, dubbed Snapshot, from a company called Parabon NanoLabs, which on its website promises to help police departments “solve [their] cases — FAST!” You know, because that’s what everyone wants cops to do: whip up a would-be “criminal” using what’s essentially an Ancestry.com profile, so they can get ’er done quick by… what? Investigating anyone who might match that “description”? Got it.

Again, the science just doesn’t check out well enough for any of these generated descriptions to be used in good faith. As Rutherford noted, it’s impossible to accurately gauge most physical characteristics, skin color included, from DNA alone. A genetic sample flat-out can’t tell you someone’s age or weight, and it says nothing about physical changes wrought by environmental factors like pollution. The EPS knows this, because it had to use several default settings to generate its person of interest.

Bafflingly, the EPS did acknowledge many of those shortcomings in its announcement.

“Using DNA evidence from this investigation… a ‘Snapshot’ composite was produced depicting what the POI may have looked like at 25 years old and with an average body-mass index (BMI) of 22,” reads the department’s press release. “These default values were used because age and BMI cannot be determined from DNA.”

In other words: this POI might have these characteristics, and though his age is unknown, this is what the maybe-person could have looked like at the age of 25 with a BMI of 22. There’s a hole in the logic from every angle here.

But they rolled it out anyway. On social media, critiques were searing.

“This is why we want the police defunded,” wrote one Twitter user. “You’re wasting money on racist astrology for cops.”

“I like that you were somehow sold this product,” responded another, “without anyone going ‘wow, that’s an obviously terrible idea’ at any point.”

That anger didn’t fall on deaf ears. After a wave of criticism, the EPS removed the image from its press release and social media, and issued a moderately contrite new statement.

“The potential that a visual profile can provide far too broad a characterization from within a racialized community and in this case, Edmonton’s Black community, was not something I adequately considered,” wrote Enyinnah Okere, the chief operating officer for the Community Safety and Well-Being Bureau of EPS. “There is an important need to balance the potential investigative value of a practice with the all too real risks and unintended consequences to marginalized communities.”

“Any time we use a new technology — especially one that does raise concerns about profiling of a marginalized group — we cannot be careful enough in how we validate these efforts and fully, transparently consider the risks,” he continued. “We have heard legitimate external criticism and we have done our own gut checks internally to determine whether we got the balance right — and, as a leader, I don’t think I did.”

For now, the tech seems like a dead-end generator, not to mention a pseudoscientific dragnet that could catch innocent people in the crossfire.

More on cop tech: Man Sues City of Chicago, Claiming Its AI Wrongly Imprisoned Him
