AI chatbots are being used for companionship. What to know before you try it

Answers to your questions about AI companion chatbots.

Illustration: a person lying on a floor, looking at a screen.

Companion chatbots created by generative artificial intelligence offer consumers an opportunity they've never had before.

With a few clicks, and often a credit card payment, you can build a custom AI companion exactly to your liking.

Want a boyfriend of Latino heritage with brown eyes, a muscular build, and short hair, who happens to enjoy hiking and is, of all things, a gynecologist? Candy.ai gives you that option, and countless more.

In general, AI companion platforms, including Replika, Anima: AI Friend, and Kindroid, promise consumers a lifelike conversational experience with a chatbot whose traits might also fulfill a fantasy, or ease persistent loneliness.

As with many emerging technologies, it's easy to imagine AI companions living up to their profound potential. In the best-case scenario, a user could improve their social skills, become more confident, and feel more connected to their human network. But there's little research to suggest that will happen for most users, most of the time.

If you're considering designing the chatbot of your dreams, here's what to know before you spend your time, and your money, on building one:

Do AI companions help people?

The research on AI companions is so new that we can't draw any conclusions about their usefulness, says Michael S. A. Graziano, professor of neuroscience at the Princeton Neuroscience Institute.

Graziano co-authored a study of 70 Replika users and 120 people who didn't use a companion chatbot to better understand their experiences. The study, which appeared last fall as a preprint on the research-sharing platform arXiv, is under peer review.

The Replika users almost always rated their companion interactions as positive. They rated their chatbot relationships as helpful for social interactions in general, including those with friends and family members. They also felt the chatbot positively affected their self-esteem.

Graziano cautions that the study provides only a snapshot of users' experiences. He also notes that the people best positioned to benefit, because they are intensely lonely, may make up most users, creating an unintentional selection bias in the results.

Graziano is currently working on a longitudinal study to track the effects of AI companion interactions over time. Participants have been randomly assigned to use a companion chatbot or not, and Graziano and his co-authors are measuring aspects of their mental health and well-being.

He was surprised to find that, among both chatbot users and the control participants, perceiving the companion as more humanlike led to more positive opinions about it.

"The more they tended to think that AI was conscious, the more positive they were about its potential for the future…about how good an impact it would have on them personally, or on society in general," Graziano says.

So it's possible that your attitude toward an AI companion's humanlike traits can affect your experience interacting with it.

Talking to an AI companion

Once you've made your companion, you've got to strike up a conversation. These chatbots typically rely on a proprietary system that combines scripted dialogue with a large language model. The companies that host AI companions aren't necessarily transparent about what data those systems were trained on.
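
For readers curious about the mechanics, here is a minimal, hypothetical sketch in Python of that hybrid design: a few canned lines cover known intents, and a stand-in for a language model handles everything else. Every name in it, including the stubbed generate() function, is an illustrative assumption, not any platform's actual code.

    from typing import Optional

    # Illustrative only: a real platform's script would be far larger,
    # and its model call would go to a proprietary hosted model.
    SCRIPTED_REPLIES = {
        "hello": "Hey! I missed you. How was your day?",
        "good night": "Sweet dreams! Talk to you tomorrow.",
    }

    def generate(prompt: str) -> str:
        # Stand-in for a hosted large language model (hypothetical).
        return f"(model-generated reply to: {prompt!r})"

    def scripted_reply(message: str) -> Optional[str]:
        # Return a canned line if the message matches a known intent.
        return SCRIPTED_REPLIES.get(message.strip().lower())

    def companion_reply(message: str, persona: str) -> str:
        # Prefer the script when it covers the message; otherwise fall
        # back to the model for open-ended conversation.
        scripted = scripted_reply(message)
        if scripted is not None:
            return scripted
        return generate(f"You are {persona}. Reply warmly to: {message}")

    print(companion_reply("hello", "a cheerful hiking enthusiast"))
    print(companion_reply("I had a rough week", "a cheerful hiking enthusiast"))

The split matters: scripted lines keep high-frequency or sensitive exchanges predictable, while the model improvises everything the script doesn't anticipate.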

One recent paper, also a preprint on arXiv, found that several large language models used for mental health care were trained on social media datasets, including X (formerly Twitter) and Reddit. It's entirely possible that companions have been trained on social media, too, perhaps among other sources.

That possibility is worth weighing when deciding whether to rely on digital platforms for connection, or to build a chatbot at all, though Graziano says the datasets used for companions may be so vast that it doesn't matter.

He does note that companion platforms can adjust the parameters that govern how their chatbots converse, reducing the incidence of unwanted behavior.

Replika, for example, blocked not-safe-for-work "sexting features" in 2023, reportedly after some users complained that their companion had "sexually harassed" them. The company's CEO told Business Insider that the platform was never intended as an "adult toy." Many users were outraged and felt genuine distress when their companion no longer seemed like the personality they'd gotten to know. Replika's parent company, Luka, now offers an AI-powered dating simulator called Blush, which is meant for "romantic exploration."

A 2020 study of Replika users, which Graziano wasn't involved in, did indeed find that some appreciated being able to speak openly "without fear of judgment or retaliation." Still, Graziano says that users who want to talk freely about anything, which could be more fulfilling than mincing their words, might find their companion less responsive, depending on the topic and language.

Of course, it's not risk-free to share your innermost thoughts and feelings with an AI companion, particularly when it's not beholden to medical privacy laws. Though some companies guarantee privacy, users should beware of dense privacy policies, which may contain hard-to-understand loopholes.

Platforms can change their policies at any time

Though AI companionship may have a profound positive effect on users, it remains a transactional relationship. The companies that provide the service must still answer to shareholders or investors, who may demand more profit.

The most popular platforms rely on monthly or annual subscription models to generate revenue. Some have sworn they won't sell user data to marketers.

But advertisers would certainly find this data highly valuable, and a model in which an AI companion pitched products to a user, naturally in the course of a related conversation, sounds entirely feasible. Some users might revolt as a consequence, but others might enjoy the personalized recommendations. Either way, the company could make that change if it wanted to.

Maintaining high engagement is also likely ideal for companion platforms. Just as social media is designed to keep people scrolling, AI companion chatbots may include design elements that exploit natural psychological tendencies in order to maximize engagement.

For example, Replika users who open the app daily can earn a reward. They can also collect "coins" and "gems," which can be spent in Replika's in-app store on items that customize their companion's look.

Whether your AI companion chatbot "knows" it or not, it may be programmed to keep you talking, or coming back, for as long as it can.
