Startup Unstable Diffusion had raised more than $56,000 from 867 backers to train a porn-generating artificial intelligence. But Kickstarter abruptly changed its position on which AI-based projects are allowed on the platform and shut down the Unstable Diffusion campaign. Because Kickstarter uses an all-or-nothing funding model and the campaign had not yet closed, all money raised will be returned to backers.
“Over the past few days, we’ve reached out to our community advisory board and read your feedback through our team and social media,” CEO Everette Taylor said in a blog post. “One thing is clear: Kickstarter must and always will be on the side of creative work and the people behind that work. We’re here to help creative work flourish.”
Kickstarter’s new approach to hosting AI projects is deliberately vague:
“This technology is really new and we don’t have all the answers,” Taylor wrote. “The decisions we make now may be different from those we make in the future, so we want this to be an ongoing conversation with all of you.”
The platform is now looking at how projects interact with copyrighted material, especially when artists’ work appears in an algorithm’s training data without consent. Kickstarter also plans to assess whether a project will “exploit a particular community or put anyone at risk of harm.”
In recent months, tools like OpenAI’s ChatGPT and Stability AI’s Stable Diffusion have enjoyed mainstream success, bringing conversations about the ethics of AI artwork to the forefront of public debate.
Some artists took to Twitter to pressure Kickstarter into dropping the Unstable Diffusion project, arguing that AI art generators could threaten artists’ careers. The community was alarmed by Unstable Diffusion’s promotional copy:
“With $25,000 in funding, we can afford to train a new [AI] model using 75M high-quality images, including ~25M anime and cosplay images, ~25M art images from Artstation, DeviantArt and Behance, and ~25M photos.”
Many artists post their work on these sites without giving express consent to such use of their work. But despite the Kickstarter suspension, Unstable Diffusion continues to raise funds elsewhere.
“While Kickstarter’s capitulation to a vocal group of artists disappoints us, we and our supporters will not back down from defending artistic freedom,” said Armand Chaudhry, CEO of Unstable Diffusion. “We have updated our website so that our supporters can directly participate in the creation and release of new artificial intelligence art systems that are more powerful than ever. We are answering the call to defend ourselves against artists who advocate making all AI art illegal. Sponsorship support will allow us to challenge this increasingly well-funded and organized lobby.”
Unstable Diffusion now processes donations directly on its website using Stripe. Over $15,000 has been raised so far.
In a longer message posted to the Unstable Diffusion Discord community of over 97,000 members, Chaudhry warned members about the growing movement of anti-AI artists:
“It looks like the anti-AI mob is trying to silence us and destroy our community by sending false reports on Kickstarter, Patreon and Discord. They even launched a GoFundMe campaign raising over $150,000 to lobby for AI art to be illegal.”
“Unfortunately, we have seen other communities and companies shrink in the face of these attacks. Zeipher has paused development and closed its community, and Stability AI is now removing artists from Stable Diffusion 3.0. But we will not be silent. We will not let them succeed in their attempts to stifle our creativity and innovation. Our community is strong (nearly 100,000 users) and we won’t lose to a small group of people who are too afraid to use new tools and technologies.”
Stable Diffusion’s text-to-image generator is trained on a dataset of 2.3 billion images, but only roughly 2.9% of that dataset is NSFW material, leaving the model poorly equipped for that kind of content. This is where Unstable Diffusion comes in. The project, which is part of Equilibrium AI, is collecting data with the help of volunteers from its Discord server to fine-tune the AI more precisely.
In its now-suspended Kickstarter fundraiser, Unstable Diffusion said it would work to create an AI art model that can “handle human anatomy better, generate in diverse and controlled art styles, more fairly represent under-trained content generation trends such as LGBTQ, race, and genders”.
In addition, there is no reliable way to verify whether most of the pornographic content freely available on the internet was created with the consent of everyone involved. And even if a performer consents to appear in porn, that does not automatically extend to the use of their images for AI training. While this technology can produce strikingly realistic imagery, it can also be weaponized to create non-consensual deepfake pornography.
Source: TechCrunch