AI can engage in something like friendship, but it is essential to understand how deep or shallow that relationship really is. AI can analyze and react to a myriad of emotional cues, but whether it can truly talk like a friend comes down to the limits of the technology. Replika, a machine-learning chatbot that mimics human conversational behavior, gives us an idea of what AI friendship looks like. Replika has more than 10 million active users, and studies suggest that 62% of those users feel an emotional connection with their AI companions. Part of that bond comes down to the system remembering facts from past dialogues and responding with empathetic comments. It is important to understand that an AI like Replika is not a person: it has no actual history of personal experience or emotional understanding, only patterns and data.
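To make the idea of "remembering facts from past dialogues" concrete, here is a minimal Python sketch of a companion bot that stores simple user facts and recalls them in later replies. Replika's actual architecture is proprietary and far more sophisticated; the CompanionBot class and its pattern rules below are purely hypothetical.

```python
# Minimal sketch of conversational "memory": the bot stores simple facts
# the user mentions and weaves them into later replies. All names and
# patterns here are hypothetical, for illustration only.
import re

class CompanionBot:
    def __init__(self):
        self.memory = {}  # fact key -> value remembered from past turns

    def _extract_facts(self, text):
        # Naive pattern matching for statements like "my dog is Rex".
        match = re.search(r"my (\w+) is (\w+)", text, re.IGNORECASE)
        if match:
            self.memory[match.group(1).lower()] = match.group(2)

    def reply(self, text):
        self._extract_facts(text)
        if "sad" in text.lower() and "dog" in self.memory:
            # Recall a stored fact to make the reply feel personal.
            return (f"I'm sorry to hear that. Maybe spending time with "
                    f"{self.memory['dog']} would help?")
        return "I'm listening. Tell me more about your day."

bot = CompanionBot()
print(bot.reply("My dog is Rex"))      # fact gets stored
print(bot.reply("I feel sad today"))   # reply mentions Rex, simulating recall
```

Even this toy version shows why such recall feels personal to users while remaining pure pattern matching over stored data.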
Some emotional language invokes remarkably high-quality AI responses, while other emotions elicit extremely low-quality output. In an MIT survey conducted this year, 78% of respondents said that while AI can fake empathy, it cannot match the complex, real emotions we share with a human friend. During these interactions, AI uses algorithms that analyze the words, tone, and context of human communication to produce responses that sound appropriate. For example, if somebody says they are sad, an assistant such as Siri or Alexa may offer soothing words in response, but those responses are drawn from generalized templates of empathy rather than genuine tenderness and care.
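To illustrate the kind of word-and-tone analysis described above, here is a toy sketch that classifies an utterance with a tiny hand-built sentiment lexicon and picks a canned empathetic reply. Real assistants such as Siri and Alexa use trained models over far richer signals; the lexicon and templates here are invented for illustration.

```python
# Toy tone-aware response selection using a hand-built lexicon.
# Commercial assistants rely on trained models over words, prosody, and
# context; this only shows the basic shape of the idea.
NEGATIVE = {"sad", "lonely", "anxious", "upset", "terrible"}
POSITIVE = {"happy", "great", "excited", "wonderful", "good"}

def detect_tone(utterance: str) -> str:
    words = set(utterance.lower().split())
    neg, pos = len(words & NEGATIVE), len(words & POSITIVE)
    if neg > pos:
        return "negative"
    if pos > neg:
        return "positive"
    return "neutral"

RESPONSES = {
    # Template replies keyed by detected tone: generalized empathy,
    # not genuine feeling.
    "negative": "I'm sorry you're going through that. Want to talk about it?",
    "positive": "That's wonderful to hear! What made your day so good?",
    "neutral": "Thanks for sharing. Tell me more.",
}

def respond(utterance: str) -> str:
    return RESPONSES[detect_tone(utterance)]

print(respond("I feel sad and lonely tonight"))  # -> consoling template
```

The point of the sketch is that the "empathy" lives entirely in the templates; the system matches a tone and fills in a reply, which is exactly why the response feels generalized rather than tender.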
At the same time, AI technology is getting better at delivering customized experiences. According to a 2021 report in the International Journal of Human-Computer Studies, AI usage in mental-health support is on the rise. AI-driven applications such as Woebot have proven effective in helping people cope with anxiety and stress: 63% of users of these apps reported improvement within two weeks. Like a well-meaning friend, Woebot is built on principles of cognitive-behavioral therapy (CBT) and freely dispenses advice on how to handle your feelings. The point is not that AI replaces human interaction, but that it uses structured psychological methods to guide and comfort.
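As a rough illustration of that structured approach, the sketch below walks a user through a scripted CBT-style thought-reframing exercise. Woebot's actual content and conversation flow are proprietary; the five prompts here are a hypothetical example of the technique, not its real script.

```python
# Hypothetical CBT-style reframing exercise: a fixed sequence of prompts,
# one question per turn, illustrating how structured psychological methods
# can be delivered in chat form.
CBT_REFRAME_STEPS = [
    "What situation is bothering you right now?",
    "What thought went through your mind when it happened?",
    "What evidence supports that thought? What evidence goes against it?",
    "Is there a more balanced way to look at the situation?",
    "What is one small step you could take to feel a bit better today?",
]

def run_reframing_exercise():
    """Step through the scripted CBT prompts, collecting one answer each."""
    answers = []
    for step, prompt in enumerate(CBT_REFRAME_STEPS, start=1):
        print(f"Step {step}: {prompt}")
        answers.append(input("> "))  # in a real app, replies arrive via chat UI
    print("Nice work. Revisit your balanced thought whenever the worry returns.")
    return answers

if __name__ == "__main__":
    run_reframing_exercise()
```

The structure, not any understanding on the bot's part, is what does the therapeutic work here, which is consistent with the paragraph's claim that such tools guide and comfort rather than befriend.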
Another aspect of AI talking like a buddy is the question of ethics. AI ethics experts such as Dr. Kate Crawford of Microsoft argue that a system which merely appears to provide companionship, however friendly or functional, cannot forge a genuine emotional bond. As Crawford contends, "AI is not conscious and cannot know friendship; it can only seem to." As AI systems become ever more conversational, this caveat is worth keeping in mind.
So, in one sense, AI can talk to you like a friend. It can replicate many of the activities of friendship, such as offering support or making small talk, but real friendship runs deeper than that: it rests on an emotional intelligence that AI interaction lacks. While AI can offer solace, companionship, and mental-health support when a human friend is unavailable, it simply cannot provide the same depth of emotional connection. Talking to an AI might feel nice in some situations, but bear in mind that it is not a replacement for actual human interaction.