I interviewed the founder of Civitai, an AI platform accused of deepfakes and porn. It highlighted the challenges of open source AI.
Civitai is an example of a startup walking a fine line between the promise of open source generative AI and the downsides of NSFW content at scale.
[AN UPDATE: Since publishing this post I’ve gotten some pretty interesting pushback from 404 Media for my willingness to interview Civitai’s CEO after the company reached out to me. To that, I simply say “thanks for reading and sharing.” 🤷♀️]
As a senior writer at VentureBeat, I don’t cover consumer AI products very often, unless they are released by Big Tech. So I was surprised two weeks ago when a PR firm representing year-old startup Civitai, an AI image model-sharing platform that had recently been in the news due to allegations from 404 Media of nonconsensual deepfakes and AI-generated porn on the site, reached out.
The rep offered me an exclusive interview with Civitai founder Justin Maier — saying that Civitai wanted to share their side of the story, “including important context and nuance on the evolving role of content moderation in AI.”
With an overflowing inbox of pitches, I was tempted to delete the message — especially when I read the 404 Media stories and it didn’t take long to encounter NSFW (Not Safe for Work) warnings on Civitai’s website. Did I really want to go down this racy rabbit hole?
Still, it couldn’t hurt to just have a chat and find out what this was all about, so I set up a call for the following week — which led to a story I published yesterday on VentureBeat: “Civitai founder champions open source, downplays AI deepfake porn.”
It turned out that Maier’s take on Civitai was fascinating. It seemed like a perfect example of several narratives I’ve been following all year: The debate between open and closed AI; the fine line between rapid innovation at scale and tackling issues around privacy, safety, bias, content moderation and copyright; and the fact that generative AI has powerful use cases that the public wants to use for very different things — both positive and negative.
We talked about how Maier, a Boise, Idaho-based developer, started Civitai as a “passion project” to support an open source community discovering, creating and sharing models and image-generated content based on the popular text-to-image generator Stable Diffusion. Since then, it has exploded from a four-person startup with fewer than 100,000 users to a 15-person company with $5 million in funding from VC firm Andreessen Horowitz, growing rapidly to 10 million unique visitors each month and millions of uploaded images and models. It certainly sounded overwhelming.
The vast majority of Civitai users, Maier explained, are simply LoRA model enthusiasts — LoRA models are small, fine-tuned models trained on specific characters or styles — looking to express themselves through AI art generation for everything from fan fiction and anime characters to photorealism and even fashion.
He pointed to a new safety center on Civitai’s website and policies such as Three Strikes and Zero Tolerance for inappropriate content. He emphasized that 404 Media’s accusations were often misleading, using figures from June 2023, for example, when Civitai’s image-generating feature was still in internal testing (the company said it launched in September). Contrary to those figures showing 60% of content on Civitai as NSFW — a figure derived from 50,000 images — today users on Civitai generate 3 million images daily, and the company says “less than 20% of the posted content is what we would consider ‘PG-13’ or above.”
But about 15 minutes into our call, we got to the fascinating meat of the matter: I pointed out that Maier could have gone in a different direction, imposing controls on content posted to Civitai, such as banning NSFW content or deepfakes. Wouldn’t that have been an easier route, one that would have avoided the current 404 Media controversy?
Maier immediately responded by saying that he had grown up Mormon and was “not a huge fan of pornography and things like that myself.” But it was clear that he took the promise of open source AI very seriously — even pointing to the New Testament’s Parable of the Weeds as an explanation. The parable, related by Jesus in the Book of Matthew, describes how servants eager to pull up weeds were warned that in doing so they would also root out the wheat, so they were told to let both grow together until the harvest.
“People that are there to make these NSFW things are creating and pushing for these models in ways that kind of transcend that use case,” Maier said. “It’s been valuable to have the community even if they’re making things that I’m not interested in, or that I prefer not to have on the site.”
We circled back to this topic several times, but Maier never wavered — he seemed to feel it was essential both to make sure that the small percentage of users creating objectionable content did not disrupt the opportunities for the vast majority using the open source models for good, and to do everything he could to keep the site from going off the rails.
“These bad actors that are such a small portion” of what the site is actually being used for, he told me, “are making us an easy target.”
Maier, presumably like many in the Civitai community, seems to be an active generative AI hobbyist deeply immersed in the world of open source AI models. It was really interesting to find myself talking philosophically with him about how the tech of image generation advances — especially in the realm of anatomy.
For example, it was actually rather wild to chat with Maier about the ins and outs of how people in the community tried to improve anatomical concepts in Stable Diffusion — training models on better faces, eyes, hands or yes, even penises — to achieve models that were better at doing things like human faces, or anime, which were then merged for even more improvement. “This is an open source community of hobbyists who have pushed the technology forward, perhaps even further than Stability [the company behind Stable Diffusion], this company that had hundreds of millions of dollars for the tech,” he told me.
Yes, we chatted about penises: When I asked if by pushing the technology forward Civitai also enables the potential for deepfakes or pornography, Maier said that one challenge is that anatomical concepts can overlap. “If we didn’t capture penises, what else is going to be affected by that?” he said. “How the [model] weights affect each other with this stuff is that by not properly capturing penises means that fingers look funny now.”
Talk about being in the weeds! Ultimately, Maier seemed like a really nice guy — a father of two daughters, one recently diagnosed with Type 1 diabetes (Civitai is currently raising money for juvenile diabetes research). But I can’t imagine controversy around the site dying down, though I can’t imagine Civitai’s growth slowing down either.
The question is, will Maier and his small team be able to handle the speed and scale of what’s coming if they stick to their guns and work to keep a fully open source AI platform safe? He told me many times that they are “doing their best” when it comes to content moderation and safety policies, and I believe that. But it certainly is a clear example of a small startup tackling some of generative AI’s biggest challenges.