CEO of Roblox Says Child Predators on the Platform Are an “Opportunity”

Roblox CEO and co-founder David Baszucki says his game's longstanding issues with child predators are an opportunity for growth.
Illustration by Tag Hartman-Simkins / Futurism. Source: Jerod Harris / Getty Images for Vox Media

Though the children’s gaming platform Roblox has notched over 200 million daily users at its peak, they’re not all there for fun and games. Active since 2006, the extremely popular online game has become a haven for bad actors, like racketeers running online casinos for underaged users, or pedophiles prowling for victims.

It’s an issue Roblox has been keenly aware of, as some 20 federal lawsuits against the company bring children’s safety on the platform to national attention. In response to the intense backlash, the company has rolled out a facial recognition system meant to group users of similar ages together. That system opens a whole other can of worms, but now it’s comments made by Roblox CEO and co-founder David Baszucki that are raising the most immediate eyebrows.

In a lengthy new interview on the New York Times‘ “Hard Fork” podcast, Baszucki didn’t downplay the issue of child exploitation as some might expect. Instead, he bafflingly framed it as an “opportunity.”

Asked by co-host Casey Newton how he thinks about the issue of child predators on his platform, the Roblox CEO said the issue wasn’t just a “problem, but an opportunity as well.”

“How do we allow young people to build, communicate and hang out together?” he mused. “How do we build the future of communication at the same time? So we, you know, we’ve been, I think in a good way, working on this ever since we started… and so fast-forward to where we are today, it’s just like every week, what is the latest tech? At the scale we’re at, 150 million daily actives, 11 billion hours a month, like what is the best way to keep pushing this forward?”

Later on, when the hosts asked Baszucki about the challenges of handling user safety at such an astronomical scale, the CEO again presented the issue as a business opportunity — specifically, an opportunity to carve a monopoly out of the online gaming scene.

“I think we actually see there being an incredible opportunity,” Baszucki replied. “Like, the gaming space, in a way, is $190 billion. So, we now have about three or four percent of that — I guess three percent — coming through Roblox. So I would say we like the scale: it creates an opportunity for individual game creators who might be making their game by themselves, without part of systems like that, to make it as part of a more overall system.”

That “overall system,” of course, is Roblox, with all its baked-in chat features and historically poor user moderation.

The Hard Fork hosts went on to grill Baszucki on a 2024 report by short seller and activist firm Hindenburg Research, titled “Roblox: Inflated Key Metrics For Wall Street And A Pedophile Hellscape For Kids.” As the colorful title outlines, the report’s thesis was that Roblox was lowering its spending on user safety in order to report growth to investors.

“First off, Hindenburg is no longer in existence, correct? So, you should report on that,” Baszucki dodged, referencing the firm’s voluntary closure earlier this year. “They went out of business for some reason.”

“So it’s really interesting, because I think we’re diving into a situation where we’re getting better, better, better,” the Roblox CEO continued. “But would you ask the same situation of someone who converted from maybe hyper-manual labor making cars by hand to an assembly line?”

In other words, Baszucki claims that the reason for the dip in safety spending is directly linked to the adoption of AI systems. One of those moderation systems, announced over the summer, was supposedly designed to detect “early signs” of child endangerment. However, under Baszucki’s social media posts about the tool, you can find dozens of complaints that the AI is failing to stop harmful content.

“MeepCity [a game on Roblox]… has seen extreme amounts of inappropriate content in it for years,” one user warned. “It does not appear that effective action to stop violative behavior has ever been taken. Is AI moderation incapable of detecting this kind of content or is it being allowed?”

It’s too soon to tell whether the new facial recognition software will be enough to stop sexual predators from abusing the platform — but considering Roblox’s long history of ineffective stopgaps, never mind its CEO’s combative crash-out on Hard Fork, we’re not holding our breath.

More on content moderation: If You Thought Facebook Was Toxic Already, Now It’s Replacing Its Human Moderators with AI

I’m a tech and transit correspondent for Futurism, where my beat includes transportation, infrastructure, and the role of emerging technologies in governance, surveillance, and labor.
