Baggu, the popular maker of reusable bags, dropped a highly anticipated collaboration with a buzzy designer. Then all hell broke loose.
It all started with a nylon purse shaped like a horse.
Baggu, the wildly popular brand of reusable shopping bags, announced earlier this month that it would be releasing a collaborative collection with New York-based brand Collina Strada. In the past, special-edition designer drops have been successful for Baggu: a previous collab sold out within minutes of items hitting the web. This new collection — with its colorful, dreamlike prints and bags shaped like a pony, little legs and all — seemed designed to elicit that same viral hype. The brands teased designs. Influencers posted unboxing videos. Fans were ready to shop.
But on the day the bags and accessories were set to go on sale, fans got more details about the designs: some of the prints were created using AI image generator Midjourney. On product pages, a short disclaimer was added:
Blue Thorns is an AI-conceptualized print from Collina Strada’s SS24 “Soft is Hard” collection. The team used Midjourney as a tool to remix old Collina prints and drive them further. After they used Midjourney to mix two of their prints together, their graphics team transformed the concept into a repeat, inserting logos and adding new elements and layers to complete the print.
Some fans were not happy, to put it lightly. Comments on Instagram called the use of AI “lame,” “so disappointing,” and “unforgivable.” Some customers say they didn’t realize AI was involved in the design process when they placed their orders. On TikTok, some vowed never to purchase from Baggu again.
The most common complaints centered on a “lack of transparency” about the use of AI: shoppers, it seems, wanted more of a heads-up or more prominent disclaimers. Others objected to the collaboration on moral grounds, arguing that training AI tools on other artists’ work without consent is theft. The environmental impact of generative AI was another recurring concern, perhaps because Baggu touts its eco-friendly brand ethos. Baggu didn’t immediately respond to a request for comment.
There’s some gray area in the response to the collection, too: Collina Strada has used generative AI as a design tool before. The designer behind the brand, Hillary Taymour, has previously discussed her process using tools like Midjourney, describing to The Business of Fashion the iterative process of prompting AI systems over and over with her own work to see what the tool spits out. Perhaps this collaboration suffered from a lack of communication to customers leading up to the rollout — the right framing, after all, goes a long way in marketing.
Beyond the short explanation, the Baggu website offers few details about the process of generating the AI prints. In an email to The Verge, Collina Strada spokesperson Lindsey Solomon noted that only two of the prints used AI; others, like the “Sistine Tomato” print, are made by “photograph[ing] every element of the print and compos[ing] them together, hand placing each rhinestone and tomato.” The AI prints, meanwhile, are based on outputs generated by feeding Midjourney images of Collina Strada’s past work, essentially remixing the brand’s own designs. Is it still theft if your inputs are your own work? And how much freedom should artists have to experiment with these tools before it’s seen as a moral failing?
We’re in a strange transition phase with AI. Tools like ChatGPT have been around for close to two years, and our online and offline spaces are flooded with synthetic content. Sometimes it’s amusing; other times, the potential for harm and abuse is obvious. That’s why I was surprised by the rapid spread of the AI-generated “All Eyes on Rafah” image, perhaps the most viral piece of AI media yet. Are we okay with AI or not? Who gets to use it, and to what end?
This case of AI-designed reusable bags is far from the most pressing example of the tension between the future tech companies want and what everyone else envisions for our world. But it hints at a debate we’re only going to see more of and raises questions about who owns what, who gets credit, and what is fair. It seems the answer at the moment is: it depends.