What if an AI could produce interactions so real that users believed they were talking to an actual person? Although this is still far from reality, rapid technological development already offers a glimpse of that possibility.
Something similar is currently happening with OpenAI's AI chatbot, ChatGPT-4o, where OpenAI is concerned that users of this artificial intelligence may develop emotional attachment to the AI.
OpenAI Asks ChatGPT-4o Users Not to Get Emotional When Using AI
Don’t get carried away
Love is indeed blind; its target can be a fellow human or even a chatbot. OpenAI revealed that on several occasions it found an emotional connection forming between users and the program.
According to OpenAI, some users even treated the program as if it were their partner, with one saying "this is our last time together," which prompted OpenAI to investigate further.
Human relationship with AI
From there, concerns arise that emotional attachment to AI could affect social norms and society at large. It is also feared that it could strain interactions between humans: lonely people, for example, might prefer spending time with AI and cut back on time spent interacting with other people.
It Would Be Dangerous If AI ‘Hallucinated’
ChatGPT-4o can interact like a human
Another, more frightening, possibility is that users may come to obey whatever the AI tells them, even when the program is 'hallucinating' and confidently producing false information.
Of course, the potential impact here is very broad, ranging from mild to life-threatening. That's why OpenAI says it will continue to monitor how users interact with the AI and improve its systems over time. It's reminiscent of the movie Her, brott.