Good riddance.
Cock Sure
An AI-powered app called Calmara promised to offer users a quick and easy way to check for STDs. All you had to do was take a dick pic, send it in, and with the wonders of AI and science, you’d get a diagnosis — “on the spot.”
If all that sounds extremely suspect, that’s because it is. HeHealth, the company behind the app, has just shut it down after the Federal Trade Commission conducted an inquiry, The Verge reports, finding that the “clear, science-backed answers” the app supposedly offered customers about their partner’s “sexual health status” were anything but.
“The FTC is so committed to protecting consumers that it is even willing to wade through pages of dick pics to protect Americans from AI scammers,” an anonymous source familiar with the matter told The Verge.
Dick Move
One of HeHealth’s boldest claims, and the one that got it caught by the long schlong of the law, was that Calmara could detect over ten different sexually transmitted infections, among them syphilis, herpes, and HPV, with up to 94.4 percent accuracy, according to a recent FTC letter.
These capabilities were supposedly backed by a study published in a prestigious health journal. But as the FTC discovered, the app’s makers weren’t being honest about the study’s findings.
For one, four out of five of the study’s authors either worked for HeHealth or were paid consultants.
As for the study itself, not only were some of the images used to train the AI detection model provided by people who never got an actual diagnostic test to confirm their condition, but HeHealth only assessed its app’s capabilities on a “small” sample size, according to the FTC.
Moreover, the study discloses that the AI was only trained and tested on four STIs — not ten, as Calmara claimed.
Members Only
The FTC issued HeHealth a type of subpoena known as a civil investigative demand citing these findings last month. With the writing on the wall — and the FTC breathing down its neck — HeHealth decided to shut down Calmara by July 15, and said it would delete all customer data received through the app, dick pics included.
It’s been a long time coming, as the app’s shortcomings were clear before the FTC got involved. Whatever glowing press it got early in the year was soon met by a deluge of damning reporting.
Among them was an April investigation by the Los Angeles Times, which found that the app couldn’t reliably distinguish between real penises and phallic objects, including a penis-shaped cake. It also found that Calmara struggled to identify explicit textbook images of STIs.
With such a dubious track record, other facets of the app’s marketing, like calling it “your intimate bestie for unprotected sex,” are even more distressing. As The Verge notes, HeHealth tried to sell the app to women as a way to check their dates — which is a consent nightmare waiting to happen.
No doubt one of the reasons HeHealth got away with it for as long as it did is that it pinned its app’s capabilities on the nebulous powers of an AI model: as good an example as any that we’re living in an age of AI quackery.