Facebook & Suicide Prevention: Are There Ethical Issues?

ROCK HILL, S.C. – A Rock Hill man, upset after being evicted from his apartment, went on Facebook Live in May, threatening to hang himself. His girlfriend called the police. And so did Facebook. In the 911 call a Facebook employee says, “(Inaudible) from Facebook. The reason we’re calling today is because we just received (inaudible).” The 911 dispatcher responds, “(A) suicidal subject?” The Facebook employee replies, “Yes.”

Facebook stayed on the phone with Rock Hill 911 dispatcher Courtney Davis, relaying details from the man’s live video. The 911 dispatcher is heard asking, “OK, you see a noose?” The Facebook employee replies, “Yes, he’s tying a noose on…”

Facebook also provided 911 with the man’s location. Davis says it was the first time she’d ever worked with Facebook on a call. She says, “So once they gave me the latitude and longitude, I plugged it into our system.”

Facebook was able to locate the man and send police to him.

“He had some towels in a book bag and he said he was going to hang himself,” says Rock Hill Police Sgt. Bruce Haire, who was first on the scene. Once backup arrived, the officers took the man to the hospital for an involuntary mental health evaluation. Haire says, “In this case, it had a good ending.”

Facebook tells WCCB that in the last year, it has “helped first responders quickly reach around 3,500 people globally to conduct a wellness check.” Facebook’s method for assessing posts has two parts: computers and people. The company says its artificial intelligence algorithm looks for certain words used in certain contexts, certain types of comments on the post, and the time and day of the post.
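Facebook has not published the details of that algorithm. As a rough illustration only, a signal-based screening step might combine weighted cues like the ones the company describes. The phrases, weights, and threshold below are hypothetical assumptions for the sake of the sketch, not Facebook’s actual system.

```python
# Purely illustrative sketch of a signal-based screening step.
# The phrases, weights, and threshold are hypothetical assumptions;
# Facebook has not published how its model actually works.

from dataclasses import dataclass, field
from datetime import datetime

CONCERNING_PHRASES = {"hang myself", "end it all", "say goodbye"}   # assumed examples
CONCERNED_COMMENT_CUES = {"are you ok", "please call someone"}      # assumed examples


@dataclass
class Post:
    text: str
    comments: list = field(default_factory=list)
    posted_at: datetime = field(default_factory=datetime.now)


def risk_score(post: Post) -> float:
    """Combine a few weighted signals into a single score."""
    score = 0.0
    text = post.text.lower()

    # Signal 1: concerning phrases in the post itself.
    score += 2.0 * sum(phrase in text for phrase in CONCERNING_PHRASES)

    # Signal 2: worried replies from friends.
    score += 1.0 * sum(
        any(cue in comment.lower() for cue in CONCERNED_COMMENT_CUES)
        for comment in post.comments
    )

    # Signal 3: late-night posting (an assumed proxy for the "time and day" signal).
    if post.posted_at.hour < 5:
        score += 0.5

    return score


def needs_human_review(post: Post, threshold: float = 2.0) -> bool:
    """Flag the post for a trained reviewer rather than acting automatically."""
    return risk_score(post) >= threshold
```

In this framing, a high score would only route the post to the specially trained human reviewers described below, mirroring the two-part, computers-plus-people design the company describes.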

As for the people part of the equation, Facebook says its specially trained teams have experience in safety, policing and crisis work.

“Interesting, interesting was my first reaction,” says licensed clinician Justin Perry. Perry says he welcomes any chance to help keep someone safe, but has more questions about Facebook’s specially trained (suicide screening) teams. Perry says, “We need to know who’s doing it, what is guiding it, and again, knowing that you are training to do what you’re doing and it’s not just kind of a gut feel.”

Critics say Facebook’s calls to police could force non-suicidal people into unnecessary mental evaluations, or prompt arrests or shootings. In the Rock Hill case, it generated a public police incident report about a non-criminal issue.

Perry says those potential ethical impacts should be weighed carefully. “I think it’s very important that there be transparency, not only with their process, but also transparency with their users as well,” he says.

Users cannot opt out of Facebook’s suicide prevention screening. WCCB did call the Rock Hill man whose name and phone number were on that public incident report and asked if he wanted to participate in this story. He told WCCB he had no comment.