But who watches the watchers?
I Spy
Welsh police said they used facial recognition software to scan audience members streaming into a Beyoncé Knowles-Carter concert in May to catch anybody suspected of terrorism or pedophilia, the BBC reports — a revelation that raises privacy and discrimination questions about the still-flawed tech.
“It seemed to me entirely sensible,” South Wales Police and Crime Commissioner Alun Michael told Members of Parliament at a hearing, according to the BBC. “That was announced in advance and reported to me, it wasn’t secretive.”
Michael said the police department had a facial recognition van parked outside a concert hall in Cardiff on May 17, with the system comparing attendees’ faces to a watch list of suspected terrorists and pedophiles. Footage, he said, was kept for a maximum of 31 days.
Michael said they had focused on pedophiles because “there would be very large numbers of young girls attending that concert.” As for extremists, police wanted to avoid another terrorist attack like the 2017 Manchester Arena bombing at an Ariana Grande concert, Michael explained to lawmakers.
Today’s Dystopia
Nobody wants terrorists and child predators running amok, but facial recognition tech can backfire badly. A Georgia man says he was wrongly jailed when a law enforcement system confused him with someone else. Owners of Madison Square Garden have even used facial recognition to identify lawyers suing them as they attended concerts and kick them out. In yet another instance of the tech gone awry, housing officials have used it to evict poor people from public housing.
Those incidents and more are ammunition for critics of the tech. They argue that facial recognition is not only an invasion of individual privacy, because people are scanned without their consent, but also prone to misidentifying people in false positives, and that it can be used to control and surveil minority groups such as Uighurs in China.
Another important point: while this tech is supposedly being deployed for the public good, there's very little transparency about how it's being used, what data sets it draws on, and who is accountable for mistakes.
So far, there’s also very little regulation in the space — so forgive us for being skeptical. Case in point? Michael made no mention of actually catching any terrorists or pedophiles at the Beyoncé show.
More on facial recognition: Rage Against the Machine Refuses to Play Venues With Facial Recognition