Using AI to offer a new beginning
By Li Yingxue | China Daily | Updated: 2020-02-12 07:10
High-tech solutions guide specialists to help those who find challenges too daunting.
Research has highlighted the psychological effect of the novel coronavirus outbreak on active Sina Weibo users.
Results compiled from the Feb 6 study by the Computational Cyber-Psychology Lab of the Institute of Psychology at the Chinese Academy of Sciences show that, after Jan 20, the proportion of negative words concerning anxiety, poor health and death rose, while the proportion of words about family and friends fell. The level of anxiety, depression and obsession among users also rose sharply.
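The article does not spell out how such word ratios are computed, but the general approach of counting lexicon categories over time can be sketched roughly in Python. Everything below, including the category word lists and sample posts, is an illustrative placeholder rather than the lab's actual lexicon or data.

# A minimal sketch of the kind of word-ratio analysis described above.
# Assumptions (not from the article): posts are available locally as
# (date, tokens) pairs, the text is already tokenized, and the category
# word lists are hand-made placeholders.
from collections import defaultdict
from datetime import date

LEXICON = {
    "anxiety": {"anxious", "worried", "panic"},        # placeholder words
    "health": {"sick", "fever", "cough"},
    "death": {"death", "die"},
    "family_friends": {"family", "friend", "mother"},
}

def category_ratios(posts):
    """posts: iterable of (date, [tokens]); returns {date: {category: ratio}}."""
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for day, tokens in posts:
        totals[day] += len(tokens)
        for cat, words in LEXICON.items():
            counts[day][cat] += sum(1 for t in tokens if t in words)
    return {
        day: {cat: counts[day][cat] / max(totals[day], 1) for cat in LEXICON}
        for day in totals
    }

# Ratios before and after a cutoff date (such as Jan 20) could then be compared.
sample = [
    (date(2020, 1, 18), ["family", "dinner", "friend"]),
    (date(2020, 1, 22), ["worried", "fever", "anxious"]),
]
print(category_ratios(sample))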
The report is not the first time that these researchers have analyzed user psychology based on microblog content, especially negative posts.
In 2013, Zhu Tingshao, director of the lab, led his team in adopting artificial intelligence technology to identify words on Sina Weibo linked to potential suicide attempts and in organizing volunteer psychological counselors to intervene.
"We use areas of AI technology, such as language processing, to collect patterns from users who may be experiencing suicidal feelings," he says.
According to the World Health Organization, close to 800,000 people die from suicide every year, which equates to one person every 40 seconds. Suicide is the second leading cause of death among 15 to 29-year-olds globally. Traditional detection methods, such as questionnaires or encouraging patients to discuss their feelings and experiences, are generally considered insufficient.
In March 2012, a Sina Weibo user nicknamed Zoufan, a college student in Nanjing, committed suicide. Her last few social media posts indicated how depressed she was and, since her death, the comment sections of her posts have become an outlet for people to share their own unhappy feelings. Her final post alone has drawn more than 160,000 comments. That inspired Zhu to set up a team to help identify people who express suicidal thoughts online.
"The difficulty of preventing suicide is that the number of those seeking active help is low," Zhu says, adding that even though there are hotlines to help intervene, sometimes it's just too late.
His team aims to intervene earlier, using AI to analyze vocabulary and identify those at risk.
"Unlike English, where words tend to have a singular meaning, in the Chinese language, depending on context, the combination of words can mean different things and have different implications which makes it harder to detect," Zhu says.
Using AI to decode the meaning has achieved an accuracy rate of more than 80 percent when screening for potential suicides, he says. His team members make the final judgment on whether to send a private message to a user.
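The article does not describe the lab's model, but screening posts with a trained text classifier that outputs a risk score for human review can be sketched along these lines. The tiny training set, the character n-gram features and the logistic regression model below are illustrative choices, not the lab's actual method or data.

# An illustrative text-screening classifier (not the lab's model).
# Character n-grams are used because Chinese text has no spaces between
# words; the tiny training set is a placeholder for demonstration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "我真的撑不下去了",        # "I really can't hold on any longer"
    "今天和朋友吃饭很开心",     # "Had a happy dinner with friends today"
    "活着好累，不想再继续了",    # "Living is exhausting, I don't want to go on"
    "周末去爬山，天气很好",     # "Went hiking at the weekend, lovely weather"
]
labels = [1, 0, 1, 0]  # 1 = flag for human review, 0 = no flag

model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(1, 3)),
    LogisticRegression(),
)
model.fit(texts, labels)

# predict_proba gives a risk score; a human volunteer makes the final call,
# consistent with the workflow Zhu describes.
new_post = "最近一直睡不着，感觉没有希望"  # "Can't sleep lately, feel hopeless"
score = model.predict_proba([new_post])[0][1]
print(f"risk score: {score:.2f}")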
AI provides new research methods and technical support for data analysis, according to Zhao Nan, another researcher at the Institute of Psychology. On the other hand, Zhao says, it raises new questions for psychological research, such as how to use it properly to match human behavior.
"When training the AI model, if we ask for higher accuracy, we may miss something, but, when screening for the potential suicides, we choose to loosen the accuracy rate rather than overlook anyone," Zhao says.
After potential cases are identified, the first private message is sent out by computer. If the user does not reply, at most five automatic messages will be sent. Each carefully written message includes details of hotlines and websites where users can ask for help.
Once a user replies, a dozen qualified volunteer psychiatrists follow up to offer help. The volunteers contact users through a single account, communicating mostly by private message and occasionally by phone.
"Some users will open up about their difficulties or problems, and some will just reply 'thanks', which at least means we have sent them some comfort and hopefully it will help," Zhu says. "It might be their motivation to finally solve the problem."
Zhu specializes in cyber-psychology, which predicts the personality, mental health and social well-being of web users based on their online behavior.
He has two PhDs, one in speech synthesis from the Institute of Computing Technology at the Chinese Academy of Sciences and the other from the University of Alberta in Canada. Researchers with a background in computer science are rare at the Institute of Psychology, which makes Zhu well placed to combine AI with psychological analysis when conducting cyber-psychology research.
From July 2017 to April 2019, the computer sent out messages to some 30,000 people, and 3,733 Sina Weibo users went on to communicate with the volunteers. About 77 percent showed a positive attitude and only 10 percent replied negatively.
After communicating with users who had suicidal thoughts, Zhu noticed that, aside from those suffering from mental illness, half of them are young people facing problems that, to them, seem insurmountable.
"The most obvious problems that distress them are family issues. Other factors are work, study, relationships or bullying," he says.
Zhu's study also shows that half of those considering suicide never seek help from others, and even when they do, family members are not their first choice; they prefer friends or even online strangers.
"It's something we need to rethink, and we need better family education, including death education," Zhu suggests. "It's not only a mental health issue but also a social issue."