

AI therapy apps: Convenient, but at what cost?

By Aneesa Bhanji

When O Stecina noticed that the journaling app they’d been using for the past five years added a new artificial intelligence (AI) feature, they thought they would give it a try. After paying extra for AI to analyze their journals, Stecina, a fourth-year RTA media production student, said the advice and reassurance it provided didn’t meet their expectations.

“It didn’t feel [like] any deeper of an insight than I would have gotten from somebody that I talked to for an hour,” they said. “It didn’t feel as personal as I expected it to be.”

Journey, Stecina’s journaling app, is an online diary platform that introduced Odyssey AI in December 2023. The interactive chat-based bot lets users ask questions and receive responses generated from their past journal entries, according to Journey’s website.

While Canadian university students might use apps like ChatGPT to help them with their schoolwork, some are also turning to generative AI chatbots for life advice or therapy. According to Forbes, Gen Z uses AI an average of 12 times a week, compared to seven for Gen X and four for Boomers.

Stecina said their experience left them feeling concerned about the possible negative impacts of AI therapy apps, especially when it comes to privacy.

“I looked back at what I had done, and I was like, ‘this is freaky’,” they said. “If everything that [the AI is] reading is coming from me anyway…I don’t know what they’re going to use that data for.”

Stacey Ann Berry, an AI policy research member with the Centre for AI and Digital Policy, encourages students to explore emerging technology but also to consider how they may be compromising their personal information if they don’t know how their data is being used.

“It’s concerning for their privacy, because with these AI systems, it’s not fully transparent as to what is happening to the data we provide it,” Berry said. “There is no guarantee that your privacy information will be safeguarded.”

According to the Harvard Data Science Review, large language models in AI applications are found to encode biases, hallucinate content and reveal sensitive information that threatens privacy or security. 

Berry advised students to be protective of their personal data and mindful of “hallucinations” by double-checking the information’s accuracy. 

Even though the AI chatbot may have told Stecina what they “wanted to hear,” they don’t think the app’s feature will be helpful for them in the long run.

“I also am very wary of getting advice from the AI, because it has its own set of agendas baked in by virtue of it being created by a company,” said Stecina. “I think that it’s really easy for you to get caught in a feedback loop that can make a lot of your symptoms worse.”

Echoing Stecina’s concerns about the backend agendas of certain AI platforms, Huda Idrees, founder and chief executive officer of the personal health data platform Dot Health, said students should be aware of the risk of incorrect results when using generative AI like ChatGPT for mental health advice.

“The algorithm is designed by a very small number of people and we don’t know how it encapsulates the issues around mental health,” said Idrees. 

She also said inaccurate health information generated by ChatGPT can make symptoms worse or lead to a misdiagnosis, which is not a risk worth taking. 

For Quentin Stuckey, a 2022 graduate of TMU’s Literatures of Modernity master’s program, the concerns with AI therapy apps go beyond privacy and algorithms. Stuckey said that after testing out an AI therapy diary app called Reset, he thinks AI chatbots can create a sense of disconnection for students.

“There’s a lack of a physical human being there, the lack of connection and empathy,” he said. “There’s something cold and detached about communicating with the machine.” 

Stuckey said elements of seeing a therapist, such as their body language or the feeling “like someone cares,” are what’s missing from interacting with AI.

Stecina suggested that the vulnerability of opening up to another person is what can make turning to AI for advice appealing to students.

“Talking about stuff that’s very personal to another human being is scary, it’s embarrassing, there’s a whole host of emotions that go with that. And even asking for advice from a friend or your peers, that is very daunting,” said Stecina. 

“So I think that people turn inwards, and when they can’t, when they don’t have the answers, then they’re like, ‘okay, how can I do this in a way that as few people know about it as possible?’”

Idrees said students using AI chatbots instead of professional therapy is a symptom of a larger problem. 

“Especially [for students] in school post-COVID, we live in a very isolated world, so this increase in a desire to access mental health resources is also a direct result of people not having friends [or] big social circles.”

According to Universities Canada, 74 per cent of students reported that the pandemic worsened their pre-existing mental health challenges and 61 per cent reported developing new challenges. 

Clinical psychologist Dr. Diana Brecher said one of the main reasons students are turning to generative AI for therapy is its easy accessibility. 

“It’s faster to get an appointment with AI than it is in a counselling centre. And certainly, if you leave a university and you’re out in the world, it’s pretty expensive to get treatment, so that will be a big factor too,” she said. 

The Canadian Institute for Health Information reported that from 2023 to 2024, half of the Canadians who were referred to publicly funded community mental health counselling had their first scheduled session within 25 days of their referral. One in 10 people waited almost five months or more. 

Third-year nursing student Emma Lusk said that while AI therapy apps can potentially help students in the short term, they aren’t a long-term fix.

“I feel like the instantaneous reply, instead of waiting for an email from a therapist or getting put on a waitlist to see somebody, is very beneficial for students,” Lusk said. 

“But if you want more personalized care, then having an actual therapist is a better idea,” she said.

Idrees added that although AI may have positive outcomes when used alongside other resources, such as seeing a psychologist or psychiatrist, the technology is not “there yet” as a replacement for therapists.

“We haven’t really reached the sort of Skynet-level AI that James Cameron depicted in Terminator. We overestimate the power of AI. Humans are far, far superior,” Idrees said.

Until then, Idrees suggests students find a community and build friendships, which is exactly what Stecina decided to do.

“I found a lot of joy was added to my life when I joined a choir. Just having something to go to every week where I’m with a bunch of people really helps to ground me in reality,” they said. 

Stuckey also said that having a close group of friends, even at a commuter school like TMU, is important for students to feel less isolated. “I think that establishing a community, even if it’s just a close group of personal friends, really goes a long way, because you really don’t have to go through things alone,” he said.
