This article was previously published in the Globe and Mail.
Crisis hotlines, like Canada’s new 988, promise confidentiality. So why do so many trace calls and texts?
*
Rob Wipond is an investigative journalist and author of Your Consent Is Not Required: The Rise in Psychiatric Detentions, Forced Treatment, and Abusive Guardianships.
*
A teen recently called a self-described “anonymous and confidential” crisis hotline to talk about his feelings – then, minutes after the call ended, police arrived at his home, handcuffed him in front of his confused and horrified parents, and took him to a psychiatric hospital.
The hotline call-responder had decided the boy might be at risk of killing himself, and covertly contacted 911 to trace his mobile phone. At the hospital, the boy’s belongings were confiscated, he was ordered to strip naked for bodily inspection, and – now sobbing uncontrollably – he was forcibly tranquillized. “It was a living hell,” the boy told me. “I felt like my world was ending, and everyone was making it worse.”
He then broke off communication – afraid that I, too, might breach my promise of confidentiality.
Many crisis and suicide hotlines have practised this kind of call tracing for decades, while making efforts to keep the practice secret. Technology has made call and text tracing easier, and complaints have become more visible on social media, where unwitting callers and texters describe feelings of betrayal and the devastating impacts of police appearing at their homes, workplaces or schools and hauling them off for psychiatric evaluations – and, sometimes, prolonged hospitalizations and involuntary treatment. Many say they’ll never feel safe reaching out for help again.
What’s more, there’s no clear evidence that forcibly hospitalizing someone helps more than harms. Studies show even expert predictions of suicide barely beat random chance. Worse – perhaps because psychiatric hospitals tend to be depressing places – a meta-analysis found that in the first three months after hospitalization, the suicide rate was approximately 100 times the global average, and 200 times the average for those who’d been admitted with suicidal thoughts.
Disturbingly, these kinds of incidents will likely become more common in Canada with the launch of the national 988 hotline. Starting November 30, Canadians will be able to text or call the three-digit number to access free mental health and suicide prevention support. That system, overseen by Toronto’s Centre for Addiction and Mental Health and modelled after the U.S. 988Lifeline initiative, will employ tracing by covertly connecting to 911.
So it’s time to ask: Should governments ban such non-consensual call and text tracing – or at least ban crisis hotlines’ false advertising about confidentiality?
Publicly, hotline operators often imply that they only trace the calls of people actively attempting suicide. In fact, most call-tracing policies, including what’s planned for Canada’s 988, apply to a much broader spectrum of people deemed to be at “imminent risk.”
Imminent risk refers to predicting harms that might emerge hours or days later. It includes suicidal feelings and other distress or behaviours that could lead to harm. If a caller has a plan and the means to kill themselves, then they’re at imminent risk and might receive an “emergency intervention” from police – with or without the caller’s knowledge or consent.
But how many people with suicidal feelings have neither a plan nor access to any means of killing themselves? More problematically still, the (often-volunteer) call-responders are trained to ask – without disclosing that it’s an assessment question – “If you were going to kill yourself, how would you?” Having an answer raises your risk score dramatically.
Imminent risk policies are recipes for wildly subjective judgments about apparent risk levels, especially during an average 10-minute hotline conversation. Some U.S. 988Lifeline centres send out police at 40 times the rate of other centres, with little to explain the differences besides call-responder attitudes. (Until recently, Kids Help Phone – a part of Canada’s 988 – even acknowledged in its privacy policy that sometimes, its AI bot monitored conversations and triggered the emergency intervention process. After this article first appeared, Kids Help Phone changed its policy and issued a statement that the bot “triages” certain conversations for “assessment” by KHP staff for possible interventions.)
Canada’s 988 will disclose in its terms of use that contacts could be traced without consent, but won’t disclose it directly to callers or texters. And the imminent risk policy won’t be shared publicly. Psychiatrist Allison Crawford, the initiative’s medical director, told me in an interview that existing data from Canadian call centres show that about 3 per cent of contacts get subjected to emergency interventions, which often involve call and text tracing. Dr. Crawford described this as “a relatively smaller percentage.” U.S. 988Lifeline operators describe their similar rate as “rare” – though the rate specifically for callers with suicidal feelings is often five times higher. The Public Health Agency of Canada projects that 988 will see more than 500,000 interactions in its first year – so is 3 per cent of that, or 15,000 people, really such a small number?
And is tracing based on imminent risk actually legal? That question has yet to be tested by privacy commissioners or the courts. Some privacy policies – like that of the Crisis Centre of BC (which is joining the 988 system) – state that personal information is shared without consent “only as authorized by law.” A spokesperson for British Columbia’s privacy commissioner clarified that any breaching of privacy without consent must meet a high threshold of protecting health or safety, and be “clearly in the interests of the individual.”
Dr. Crawford argued the practice meets that threshold, saying, “We’re just trying to keep the person safe in that moment.” Yet minutes from internal U.S. 988Lifeline meetings show widespread recognition that police visits and forced hospitalizations can be “traumatizing” for people in emotional distress and create “dangers of brutalization, violence, and criminalization.” While Dr. Crawford said she has heard such concerns, she nevertheless believes that “emergency intervention is a necessary part of the service.”
Different approaches are possible. Call and text tracing could be abolished – after all, some hotlines never do it. Or the policy could be narrowed to truly apply only to those actively engaged in killing themselves. Or, the first time someone contacts 988, the policy could be openly discussed.
But for Canada’s 988, “confidential” is for promotional purposes only. Many call centres and crisis texting services also collect the contents of conversations and share them with “third parties” for “research” and “service improvement.” Sometimes, those conversations are even used for profit, as was the case for Crisis Text Line (also part of Canada’s 988 via its partnership with Kids Help Phone), which shared data with its for-profit AI spin-off until that practice was reported by Politico last year. And it’s highly debatable whether the often automated processes by which this data is “anonymized and aggregated” before sharing would detect and purge all potentially revealing details about people’s personal fears, family conflicts, or workplace frustrations.
So, for those who truly value confidential conversations, we may have only one choice: We must stop advising people to call crisis hotlines, and instead make ourselves more present to listen to and support each other – in private.