Following the rise of ChatGPT, an artificial intelligence (AI) tool, more and more people are using AI to write all kinds of texts. Some media outlets now use ChatGPT to write articles, and job seekers use it to write resumes. In a matter of seconds, it can generate lengthy texts that would take a human far longer to produce. It has recently been applied to tasks ranging from drafting business messages through chatbots to research and solving coding problems.
In June, a study suggested that these AI tools can even improve people's chances in dating.
Attractiontruth, an AI-based dating platform, had more than 1,300 male dating-app users try ChatGPT for profile introductions and message coaching.
As a result, those who received help from ChatGPT reported conversing with the opposite sex more confidently than before. In particular, the response rate to messages revised by ChatGPT was higher than before. The study added weight to the view that AI can help improve human relationships.
Could the indiscriminate use of AI tools, once discovered, poison human relationships?
However, using AI tools indiscriminately on the assumption that they improve human relationships could backfire. According to a new study from Ohio State University released on September 11 (local time), a relationship can deteriorate if the other party finds out that AI was used.
The study was conducted online with 208 adults, under the premise that each participant had a long-time friend named Taylor. Three scenarios were set up: the participant needed comfort because they were going through a very difficult time, needed advice about a conflict with a colleague, or had a birthday coming up. Participants chose one of the three scenarios and sent a message explaining their situation.
All participants then received an AI-written reply to their message, but they were told different things about its origin. Group A was told that Taylor had written the reply with an AI tool. Group B was told that Taylor had written it with help from another person. Lastly, Group C was told that Taylor had written the entire message themselves.
Does uncertainty about a close relationship reveal the downside of AI tools?
Interestingly, even though the same person sent the same reply, whether an AI tool had supposedly been used changed how positively or negatively it was perceived. Group A, who were told the message had been written with an AI tool, was less satisfied with their relationship with Taylor than the other groups. In particular, Group A felt less certain about how close their relationship with Taylor was.
People dislike AI-generated responses because they believe AI is inferior to humans at exchanging personal messages. AI can be useful for answering questions that have a theoretically correct answer or for composing sentences from the data it has learned, but situations that call for emotional conversation are different. Notably, Group B, who were told that Taylor had received help from another person, also reacted negatively to the reply. Underlying both reactions is the idea that third parties, whether other people or AI, should not be used to maintain a friendship.
More importantly, the study shows that using a third party to write messages is perceived as putting less effort into the relationship. Using AI tools without the other person's knowledge may help, but the moment it is discovered, it can poison the relationship.
Of course, most people will not tell the other person that they used AI to write a message. Still, at a time when such technology is being misused everywhere, the study makes us think about what matters in relationships. What is essential in human relationships is sincerity and effort. A slow but heartfelt message can convey more sincerity to the other person than a sentence dashed off by ChatGPT.
By Soo Hyun Lee