AI at Play

AI chatbots are being used for companionship. What to know before you try it

The most important things to consider before designing an AI companion chatbot.
By Rebecca Ruiz
Illustration: a person lying on a floor, looking at a screen. Credit: Mashable/Jack Chadwick

As artificial intelligence moves into every corner of modern life, we examine the ways AI enhances how we have fun and seek connection.


Companion chatbots created by generative artificial intelligence offer consumers an opportunity they've never had before.

With a few clicks, and often a credit card payment, you can build a custom AI companion exactly to your liking.

Want a boyfriend of Latino heritage with brown eyes, a muscular build, and short hair, who happens to enjoy hiking and is, of all things, a gynecologist? Candy.ai gives you that option, and countless more.

In general, AI companion platforms, including Replika, Anima: AI Friend, and Kindroid, promise consumers a lifelike conversational experience with a chatbot whose traits might also fulfill a fantasy, or ease persistent loneliness.

Like many emerging technologies, it's easy to imagine AI companions living up to their profound potential. In the best case scenario, a user could improve their social skills, become more confident, and feel more connected to their human network. But there's little research to suggest that will happen for the majority of users, most of the time.

If you're considering the chatbot of your dreams, here's what to know before you spend your time, and your money, on designing one:

Do AI companions help people?

The research on AI companions is so new that we can't draw any conclusions about their usefulness, says Michael S. A. Graziano, professor of neuroscience at the Princeton Neuroscience Institute.

Graziano co-authored a study of 70 Replika users and 120 people who didn't use a companion chatbot to better understand their experiences. The study, which appeared last fall as a preprint on the research sharing platform arXiv, is under peer review.

The Replika users almost always rated their companion interactions as positive. They rated their chatbot relationships as helpful for general social interactions with other people, as well as with friends and family members. They also felt the chatbot positively affected their self-esteem.

Graziano cautions that the study only provides a snapshot of the users' experiences. Additionally, he notes that people in the position to maximally benefit, because they are intensely lonely, might comprise most users, thereby creating an unintentional bias in the results.

Graziano is currently working on a longitudinal study to track the effects of AI companion interactions over time. Participants have been randomly assigned to use a companion chatbot or not, and Graziano and his co-authors are measuring aspects of their mental health and well-being.

He was surprised to find that, among both chatbot users and the control participants, perceiving the companion as more humanlike led to more positive opinions about it.

"The more they tended to think that AI was conscious, the more positive they were about its potential for the future…about how good an impact it would have on them personally, or on society in general," Graziano says.

So it's possible that your attitude toward an AI companion's humanlike traits can affect your experience interacting with it.

Talking to an AI companion

Once you've made your companion, you've got to strike up a conversation. These chatbots typically rely on a proprietary system that combines scripted dialogue and a large language model. The companies that host AI companions aren't necessarily transparent about what data they used.
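
How that blend works in practice is proprietary, but the general shape is easy to sketch. The Python below is a hypothetical illustration, not any platform's actual code: the scripted reply table, the llm_reply stand-in, and the persona default are all invented.

# A hypothetical sketch of a companion pipeline that mixes scripted dialogue
# with a large language model. Every name here is invented for illustration;
# real platforms do not publish their internals.

SCRIPTED_REPLIES = {
    "good morning": "Good morning! I was hoping you'd stop by today.",
    "i'm back": "Welcome back. Tell me everything I missed.",
}

def llm_reply(user_message: str, persona: str) -> str:
    # Stand-in for a call to whatever language model the platform uses.
    return f"[model-generated reply, in the voice of {persona}, to: {user_message}]"

def companion_reply(user_message: str, persona: str = "a hiking-obsessed gynecologist") -> str:
    """Use a scripted line when one matches; otherwise fall back to the model."""
    key = user_message.strip().lower()
    return SCRIPTED_REPLIES.get(key) or llm_reply(user_message, persona)

print(companion_reply("Good morning"))
print(companion_reply("What should I pack for a day hike?"))

A scripted layer keeps routine exchanges predictable, while the model handles everything else, which is where training data comes in.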

One recent paper, also a preprint on arXiv, found that several large language models used for mental health care were trained on social media datasets, including X (formerly Twitter) and Reddit. It's entirely possible that companions have been trained on social media, too, perhaps among other sources.

That possibility is relevant when considering whether to rely on digital platforms for connections or to build a chatbot, though Graziano says the datasets used for companions may be so vast that it doesn't matter.

He does note that companion platforms can change the parameters governing how their chatbots speak, in order to reduce the incidence of unwanted behavior.
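
None of the companies publish what those parameters look like, but a simple guardrail layer gives a rough idea. In this hypothetical Python sketch, the blocked-topic list, the topic detection, and the refusal line are all invented for illustration.

# A hypothetical guardrail a platform could tighten without retraining the
# underlying model. The topics and the refusal message are invented examples.

BLOCKED_TOPICS = {"explicit sexual content", "harassment"}

def apply_guardrail(candidate_reply: str, detected_topics: set[str]) -> str:
    """Replace a generated reply with a refusal if it touches a blocked topic."""
    if detected_topics & BLOCKED_TOPICS:
        return "I'd rather not go there. Can we talk about something else?"
    return candidate_reply

Adding a single topic to that list changes every companion's behavior at once, no matter how attached a user has become.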

Replika, for example, blocked not-safe-for-work "sexting features" in 2023, reportedly after some users complained that their companion had "sexually harassed" them. The company's CEO told Business Insider that the platform was never intended as an "adult toy." Many users were outraged and felt genuine distress when their companion no longer seemed like the personality they'd gotten to know. Replika's parent company, Luka, now offers an AI-powered dating simulator called Blush, which is meant for "romantic exploration."

A 2020 study of Replika users, which Graziano wasn't involved in, indeed found that some appreciated being able to speak openly "without fear of judgment or retaliation." Graziano says users who want to talk freely about anything, which could be more fulfilling than mincing their words, might find their companion less responsive, depending on the topic and language.

Of course, it's not risk-free to share your innermost thoughts and feelings with an AI companion, particularly when it's not beholden to medical privacy laws. Though some companies guarantee privacy, users should beware of dense privacy policies, which may contain hard-to-understand loopholes.

Platforms can change their policies at any time

Though AI companionship may have a profound positive effect on users, it remains a transactional relationship. The companies that provide the service must still answer to shareholders or investors, who may demand more profit.

The most popular platforms rely on monthly or annual subscription models to generate revenue. Some have sworn they won't sell user data to marketers.

But advertisers would certainly find this data highly valuable, and a business model in which an AI companion pitches a user's favorite products to them, naturally in the course of a related conversation, sounds entirely feasible. Some users might revolt as a consequence, but others might enjoy the personalized recommendations. Regardless, the company could make that change if it desired.

Maintaining a high engagement level is also likely ideal for companion platforms. Just like social media is designed to keep people scrolling, there may be elements of AI companion chatbot design that exploit natural psychological tendencies in order to maximize engagement.

For example, Replika users who open the app daily can receive a reward. They can also earn "coins" and "gems," which can be used in Replika's in-app store to purchase items that customize their companion's look.
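
The mechanics behind those rewards are basic engagement loops. Here is a generic, hypothetical sketch of a daily check-in bonus with streaks; the function, the coin amounts, and the rules are invented and don't reflect Replika's actual system.

# A hypothetical daily check-in loop with streak bonuses, in the spirit of the
# coins and gems described above. Amounts and rules are invented.

from datetime import date, timedelta

def grant_daily_reward(last_check_in: date, streak: int, today: date) -> tuple[int, int]:
    """Return the updated streak and the coins awarded for opening the app today."""
    if last_check_in == today:
        return streak, 0                      # already rewarded today
    if last_check_in == today - timedelta(days=1):
        streak += 1                           # consecutive day: streak grows
    else:
        streak = 1                            # missed a day: streak resets
    coins = 10 + 5 * min(streak, 7)           # bigger bonus for longer streaks, capped
    return streak, coins

streak, coins = grant_daily_reward(date(2024, 7, 30), streak=4, today=date(2024, 7, 31))
print(streak, coins)  # prints: 5 35

The streak reset is the hook: skip a day and the bonus starts over, the same pull that keeps people scrolling social feeds.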

Whether your AI companion chatbot knows it or not, it may be programmed to keep you talking, or coming back to it, for as long as possible.

Rebecca Ruiz

Rebecca Ruiz is a Senior Reporter at Mashable. She frequently covers mental health, digital culture, and technology. Her areas of expertise include suicide prevention, screen use and mental health, parenting, youth well-being, and meditation and mindfulness. Prior to Mashable, Rebecca was a staff writer, reporter, and editor at NBC News Digital, special reports project director at The American Prospect, and staff writer at Forbes. Rebecca has a B.A. from Sarah Lawrence College and a Master's in Journalism from U.C. Berkeley. In her free time, she enjoys playing soccer, watching movie trailers, traveling to places where she can't get cell service, and hiking with her border collie.

