It’s Now Illegal to Post Fake AI-Generated Product Reviews by People Who Don’t Exist

It’s official.

Cleaning Up

It’s officially illegal to publish fake, AI-generated product reviews.

Sweeping new Federal Trade Commission (FTC) rules aimed at cleaning up the polluted, confusing world of online product reviews went into effect on Monday, meaning the federal agency can now levy civil penalties against bad actors who knowingly post product reviews and testimonials that mislead American consumers.

The new rules are expansive, prohibiting sleazy businesses from engaging in a wide array of abusive tactics. That list includes using generative AI tools to whip up fake testimonials or product review articles — bonus points if those reviews are attributed to someone who isn't real, or published by someone overstating or misrepresenting their level of experience with a given product.

A perfect example of this kind of content? The review-style articles published at dozens of outlets, including Sports Illustrated and The Miami Herald, by a third-party company called AdVon Commerce: articles that multiple Futurism investigations revealed to be largely AI-generated and bylined by fake authors outfitted with equally fake profile pictures and bios touting made-up expertise.

The new rule addresses "reviews and testimonials that misrepresent that they are by someone who does not exist, such as AI-generated fake reviews," the FTC wrote, "or who did not have actual experience with the business or its products or services, or that misrepresent the experience of the person giving it."

Buying Stars

The FTC’s new policies also allow it to go after people or companies that purchase phony positive or negative reviews, buy up social media followers, or use intimidation — groundless legal threats, threatening language, threats of physical violence, and so on — to dissuade consumers from leaving critical ratings.

In somewhat muddier waters, the rule also forbids companies from soliciting "insider" reviews from people with undisclosed material connections to a product or business. It also makes it illegal for a website to publish review content under the guise of editorial independence when it actually has a material interest in the products it's reviewing.

One example of that latter abuse would be, say, a tire company operating a tire review website without disclosures. It's also where that same media company, AdVon Commerce, might run afoul of the FTC's new rules: as Futurism first reported, AdVon also operates a second company, Seller Rocket, which places pay-to-play products in AdVon-produced product reviews without disclosing the arrangement to readers.

"Fake reviews not only waste people's time and money," FTC Chair Lina Khan said in a statement when the rule was first announced, "but also pollute the marketplace and divert business away from honest competitors."

More on fake AI product reviews: Meet AdVon, the AI-Powered Content Monster Infecting the Media Industry

