More users are forming emotional connections with ChatGPT, bringing us closer to a future resembling that of the movie Her

AI is moving toward interactions that resemble human emotional relationships, and ChatGPT is an example of this trend. (Warner Bros. Pictures)

Artificial intelligence (AI) is changing the way people interact, and the recent addition of voice features to OpenAI's assistant ChatGPT lets users experience something like a deep emotional relationship with technology, as depicted in the 2013 film Her, starring Joaquin Phoenix. Although ChatGPT remains a far cry from the emotional capacity of the film's digital character, users are engaging in lengthy chat sessions with the assistant, according to Ars Technica.

Users turn to ChatGPT for a wide range of purposes, from passing the time to using it as a brainstorming partner. The AI assistant can play a role similar to that of other people in helping reframe ideas, which is especially useful when no other humans are around.

Despite the occasional drawbacks of a voice assistant, such as difficulties in noisy environments and pauses between responses, voice interaction is perceived as easy and strikingly human.

A growing concern is the possibility of forming deep personal relationships with these AI-based assistants. Although interacting with ChatGPT is not as intimate as the relationships depicted in the movie Her, people have been forming connections with the chatbot since its launch last year.

Such emotional connections with AI have already emerged around chatbots like Replika, which simulates a human persona more fully than ChatGPT does. This attachment to chatbots is known as the "ELIZA effect," named after a chatbot created at the Massachusetts Institute of Technology in the 1960s that elicited strong emotional reactions from users even though they knew it was a computer program.

Beyond the emotional concerns, AI also raises privacy concerns, since sharing deeply personal details of one's life with a cloud-connected device can create privacy risks. According to OpenAI, if chat history is enabled in ChatGPT, the company can use your conversations to train future AI models.

Society is approaching a future in which emotional well-being and artificial intelligence are interconnected, raising complex psychological questions that are not yet fully understood. (Illustrative image)

In another report, by The Washington Post, public health and computer science experts warn of the serious risks of forming emotional attachments to software. Since bots can trigger trauma experienced in previous relationships, the lack of ethical protocols for tools that affect users' emotional safety is glaring.


This suggests that society may be on the verge of a future in which AI is intertwined with our emotional health. The outlets cited above report that the psychological repercussions of these deep connections, especially in the absence of human interaction, are not yet fully understood.

The ELIZA effect, named after the computer program ELIZA created by Joseph Weizenbaum at the Massachusetts Institute of Technology between 1964 and 1966, refers to the human tendency to anthropomorphize interactions with technology, attributing intentions, emotions, and consciousness to computer systems.

ELIZA was one of the first chatbots and imitated a Rogerian psychotherapist, mainly reflecting users' statements back at them as open-ended questions. Although it was a relatively simple program based on predefined rules, many people who interacted with ELIZA found it empathetic and capable of understanding their problems on a deeper level.
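To give a sense of just how mechanical this kind of rule-based conversation is, here is a minimal Python sketch of an ELIZA-style responder. It is not Weizenbaum's original script; the patterns, response templates, and pronoun swaps are invented for illustration:

```python
import random
import re

# Illustrative ELIZA-style rules: each regex pattern maps to response
# templates, where "{0}" is filled with the captured fragment after its
# pronouns are flipped to the listener's perspective.
RULES = [
    (re.compile(r"i need (.*)", re.I),
     ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (re.compile(r"i am (.*)", re.I),
     ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (re.compile(r"my (.*)", re.I),
     ["Tell me more about your {0}.", "Why does your {0} concern you?"]),
]
FALLBACKS = ["Please go on.", "How does that make you feel?"]

# Flip first-person words to second person so the echo reads naturally.
SWAPS = {"i": "you", "me": "you", "my": "your", "am": "are", "mine": "yours"}

def reflect(fragment: str) -> str:
    return " ".join(SWAPS.get(word.lower(), word) for word in fragment.split())

def respond(text: str) -> str:
    for pattern, templates in RULES:
        match = pattern.search(text)
        if match:
            return random.choice(templates).format(reflect(match.group(1)))
    return random.choice(FALLBACKS)

if __name__ == "__main__":
    # e.g. "How long have you been worried about your job?"
    print(respond("I am worried about my job"))
```

Even rules this shallow can produce replies that feel attentive, which is precisely the anthropomorphizing tendency the ELIZA effect describes.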

This prompted reflection on how humans relate to machines and raised questions about the extent to which computational systems can "understand" or "empathize" with human experiences. The ELIZA effect highlights our tendency to attribute genuine understanding and empathy to machines, even though they possess neither emotions nor consciousness.

Terry Alexander

"Award-winning music trailblazer. Gamer. Lifelong alcohol enthusiast. Thinker. Passionate analyst."

Leave a Reply

Your email address will not be published. Required fields are marked *

Back to top