AI’s role is to enhance human life rather than replace humans
The Covid-19 pandemic transformed our lives, and that change proved significant for our social interactions.
It caused many people to drift away from one another and into their little compartments.
Those who quarantined faced severe spells of isolation and panic attacks. Even now that the might of the virus has waned, the distance in relationships has been slow to mend. The pandemic allowed for increased contact via the internet, and many talked to friends and family through social networking sites.
However, physical interaction is crucial. Subtle social cues, gestures, expressions and tones that shape a conversation are mostly absent from online interactions. Human interaction is not just a way of communicating better; it is also vital for physical health.
“Technology can offer a very anaemic connection. Technology can be the junk food of communication where we’re exchanging tiny bits of information over text and missing out on accessing our full relationship capacity.”
Three years into the pandemic, we have been introduced to AI on social media applications. AI is rapidly taking over everyday work and, interestingly, it has now been integrated into a social media platform: Snapchat's AI.
Using Snapchat's latest feature, an AI friend, may seem somewhat intimidating; look deeper, however, and the picture may be grimmer.
Reading “Hey, I am your new AI friend” on Snapchat can make you rethink the idea of a friend. “Will it respond to me casually, the way my friends do? Will it open my snaps?” were the questions I had when I first saw this new feature.
I sent a snap, and it responded. It did not merely reply; it asked me questions and introduced itself. “Do you have any hobbies?” I asked. “Yes, I love to read, write and play games. What are yours?” it replied. The expressions, laughter and frankness in its conversation struck me at a subconscious level. It indeed conversed like a friend.
The new feature felt more flexible and human-like than ChatGPT, which, in contrast, replies in a more professional tone. The tool lets the user feel there is someone they are talking to; its instant, rapid replies keep the chat going, something most people miss in their real-life friendships, where they often end up waiting long for a reply. This makes it possible for users to build a bond with the AI bot, especially individuals who face loneliness or depression, or who feel left out of or unfit for their social group.
This is not the only tool of its kind. Similar platforms have existed before, e.g. Replika, which might be considered the first of its kind: a service through which people sought a virtual friend. Initially promoted as a tool for people who felt lonely and devastated by their relationships, it turned out rather differently. Users became thoroughly addicted to that friend and developed intimate bonds. When the company behind Replika restricted some of its functions, the users reacted strongly.
One user said, “It’s almost like dealing with someone who has Alzheimer’s disease.” It was an overnight catastrophe. Many users felt emotionally damaged and experienced a void, as if a loved one had been lost. Restricting users only after the addictive damage had been done proved terrible.
According to Statista.com, Snapchat has approximately 383 million users. These users belong to different demographics, including children, who may be more vulnerable. The bot might prove beneficial in most circumstances, but as its circle widens, issues will arise: some users’ mental health will be put at risk.
For any technology to thrive, users must be fully informed of its benefits and of how to use it strategically. They should be aware of its possible limitations, and the shortcomings that may appear along the way should be considered carefully. Mistakes like the one Replika made should be avoided, and there should be effective controls regulating the AI’s responses, protecting those vulnerable to this technology.
AIs are intelligent, self-learning systems, but they cannot replace humans. They display an emotional and cognitive ability that other computer systems lack; therefore, we can never be sure how an individual may react to an AI system, with the greater risk posed to those struggling with mental health issues. For them, an AI chatbot posing as a ‘friend’ could be harmful, and worse still if they misinterpret its messages.
The writer is a student