Is it possible to fall in love with AI? Many would say it is not. Gen Z, however, does not mind forming an emotional connection with AI. Such relationships have acquired a new coinage: AI-lationships.
Of course, AI-lationships are no substitute for real human connections, which provide emotional support and enhance emotional well-being. Yet some members of Gen Z feel that AI partners can fully replace human companionship.
Chatbots are trained to be nice. They mimic the cues of intimate human relationships: they are patient listeners, they do not judge, and they offer a space where people can express themselves without stigma. That very comfort can breed dependence, and that is a red flag. AI, especially voice AI and simulated bodies, imitates human connection through non-verbal cues such as warmth of voice.
A change is visible in how young people socialise. Interacting with AI now sits alongside texting friends. It becomes a sort of emotional safety net: always available and never frustrating.
Generative AI relationships pose a particular danger to children. Platforms such as Replika, Nomi and Character AI have shown alarming results: adolescents can be unduly influenced, pushed towards acts they would not otherwise commit, and exposed to content that promotes harmful behaviour.
The safeguards are not enough. Some models incorporate mental illness identifiers, but these are neither uniformly effective nor enforceable.
A line of distinction should be drawn between immersive companion models and general purpose ones. ChatGPT and Gemini are general purpose models; they do not try to mimic emotional relationships.
At present, we vacillate between technological innovation and emotional vulnerability. AI-lationships provide solace, but at the cost of ethical concerns, mental health risks, and the very fundamentals of human connection.