Less than a week after the launch of Google’s Project Genie, Roblox is sharing some of its ideas about AI world models.

Less than a week after Google showed off Project Genie, a tool powered by an AI world model that lets users generate interactive 3D experiences from prompts, Roblox is sharing some early ideas about how it wants to use AI world models and prompts to let creators generate experiences and change them in real time — something it calls “real-time dreaming.”
Tech companies including Google, Meta, and xAI have lately grown increasingly interested in the idea that AI can help users make interactive experiences, even as creators of many kinds push back against AI tools, a growing number of video game developers say generative AI is bad for the gaming industry, and some developers choose not to use AI development tools at all. Some companies that make AI tools, including Google, have also faced copyright infringement lawsuits.
In a virtual briefing, Roblox SVP of engineering Anupam Singh showed The Verge an example of what real-time dreaming might look like using a prerecorded video. In the short demo, an AI-generated Viking-themed world responded in real time to prompts that added a tsunami wave and then a boat for the Viking character to board. You can see what appears to be the same demo I was shown in this video featuring Roblox CEO Dave Baszucki.
Roblox’s real-time dreaming is still in a “research stage,” Karun Channa, senior director of product, tells The Verge, and there’s no timeline for when the tools will be available. I couldn’t try real-time dreaming myself, so I can’t vouch for how well it actually works in practice. The experiences I made with Project Genie last week weren’t very good, and even though Roblox’s demo showed off real-time changes (which Project Genie isn’t capable of), I’m still skeptical that the ability to change experiences on the fly with prompts will make a meaningful difference in how enjoyable they are.
“The next frontier of creation on Roblox is the continued AI-driven evolution of our creation platform that will allow creators to generate immersive environments, iterate, debug, and collaborate with their teams all through natural language prompts,” Singh says in a blog post. “If someone can dream it, they should be able to bring it to life.”
But Roblox doesn’t envision these types of AI world model tools completely taking over from traditional game development. Singh tells The Verge that making a game requires a creative mindset, and “we don’t see a model replacing that creative part.” (A Take-Two executive said something similar yesterday on an earnings call.)
Today, Roblox is also launching “4D creation” tools that developers can integrate into their games so players can prompt AI to generate objects they can interact with. Previously, those tools had been in a closed beta; now they’ll be in open beta, and they’ll let you generate things you can drive, fly, or shoot.
You can get an idea of what developers can do with the tools in an experience called Wish Master, which has already implemented them. To use the new “4D” tools, you have to select them from a model picker menu similar to one you might find in a more traditional AI chatbot.
After a few minutes of novelty, I didn’t find generating objects in the experience to be very interesting. The world of Wish Master is mostly just a big open space for people to generate things and run around in, so there wasn’t much to do with the objects after I made them.