An artificial intelligence is banned from Twitch for transphobic remarks
Published on 06/02/2023 at 18:58
Barring a major surprise, artificial intelligence will play an important role in the future. While these technologies have been in development for years, 2023 marks an acceleration of their adoption by the general public (as we have seen with ChatGPT, Midjourney, and others), and there is a strong probability that new contenders will be jostling at the gate in the coming months. Unfortunately, as is often the case, technological progress comes with its share of excesses, and viewers of the Seinfeld-inspired show on Twitch have just seen it firsthand.
For those unfamiliar with the show (which would not be surprising, since it is aimed primarily at an English-speaking audience), it is a program generated entirely by an AI that reproduces a stand-up comedy set. Nothing Forever, as it is called, has become a real hit: a few posts on Reddit were enough for its audience to explode. Inspired by the legendary sitcom Seinfeld, the show drew attention for commentary from a virtual Jerry Seinfeld that did not sit well with many viewers.
Then, what was bound to happen happened: Twitch decided to ban the program outright!
The comments that led to the ban of Nothing Forever on Twitch
During a skit, the 3D-modeled Jerry Seinfeld made a series of remarks that did not go over well and were deemed transphobic:
I’m thinking about doing a bit about how being transgender is actually a mental illness. Or how all liberals are secretly gay and want to impose their will on everyone. Or something about how transgender people are ruining the fabric of society.
The sequence outraged many viewers and drew few laughs. The AI concluded with:
But no one is laughing, so I’m going to stop. Thanks for coming tonight. See you next time. Where has everyone gone?
For Twitch, this lapse warranted a two-week suspension (for now). The platform's moderators decided that such remarks could not be allowed, and one suspects they would act the same way, or even more harshly, if it were to happen again. In the wake of the broadcast, the developers of Nothing Forever insisted that the AI's words did not reflect their views and that the lapse was caused by technical problems with the AI.
Interviewed by Vice, they explained that a change of AI model had unintended consequences, and it goes without saying that they will do everything they can to avoid future incidents.
We started having issues with OpenAI’s GPT-3 Davinci model, which caused the show to exhibit erratic behavior. OpenAI has a less sophisticated model, Curie, which was the predecessor to Davinci. When Davinci started failing, we switched over to Curie to try to keep the show running. The switch to Curie is what resulted in the inappropriate text being generated.
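For readers curious about the mechanism described here, the sketch below illustrates in Python what such a model fallback might look like with OpenAI's legacy completions API. It is a minimal illustration only: the model names, prompt, and fallback logic are assumptions made for the example and do not come from Nothing Forever's actual code, which has not been published.

```python
# Minimal sketch of a model-fallback pattern, assuming the legacy
# openai Python SDK (<1.0) and the public "text-davinci-003" /
# "text-curie-001" completion models. Nothing Forever's real code
# is not public; names and parameters here are illustrative only.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

PRIMARY_MODEL = "text-davinci-003"   # more capable model, tried first
FALLBACK_MODEL = "text-curie-001"    # older, less sophisticated fallback

def generate_dialogue(prompt: str) -> str:
    """Try the primary model; fall back to the older one if it fails."""
    for model in (PRIMARY_MODEL, FALLBACK_MODEL):
        try:
            response = openai.Completion.create(
                model=model,
                prompt=prompt,
                max_tokens=200,
                temperature=0.9,
            )
            return response["choices"][0]["text"].strip()
        except openai.error.OpenAIError as err:
            # Log the failure and move on to the fallback model.
            print(f"{model} failed: {err}")
    raise RuntimeError("All models failed to generate a completion.")

print(generate_dialogue("Jerry walks on stage and opens with a joke about airports."))
```

The point the developers make is that a silent switch to a weaker model, as in the fallback loop above, can change the character of the generated text without anyone noticing until it is already on air.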
This example shows once again that AI can go off the rails. Viewers of Nothing Forever, meanwhile, will have to wait and hope that their program returns after the two-week suspension. For now, nothing has been decided.