- A new artificial-intelligence-equipped tool aims to prevent suicide among homeless youth while making the most of limited resources.
- It works by leveraging real-life social network information.
- It reduces the bias in coverage of homeless youth by as much as 20%.
Youth suicide rates increased by 300% between the 1960s and the 1980s. In Australia, for example, suicide is the second leading cause of death for individuals aged 15 to 25.
In the United States, it is the third leading cause of death among teens. According to the Centers for Disease Control and Prevention, the suicide rate for people aged 10 to 24 climbed by more than 50% from 2007 to 2017. A report by the National Health Care for the Homeless Council states that more than half of homeless people have attempted suicide or experienced suicidal ideation.
To address this issue, a team of researchers in the United States has developed an artificial intelligence (AI) equipped model that can reduce the risk of suicide. The model targets youth and works by leveraging real-life social network information.
The Robust Graph Covering Problem
The idea is to create a network of strategically positioned people who can watch out for their relatives and friends and provide help when required. The research team developed an algorithm to do exactly that.
It analyzes a real-life social group and identifies the best individuals to train as ‘gatekeepers’, who can then recognize warning signs of suicide within their group and respond accordingly.
The AI takes limited resources into account and ensures that the maximum number of people are watched over. This scenario corresponds to the problem of selecting a subset of nodes (gatekeepers) in a graph so that they cover their adjacent nodes.
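The covering idea can be sketched with a standard greedy heuristic for budgeted coverage. This is an illustration only, not the authors' actual algorithm, and the friendship network and budget below are invented toy data.

```python
def greedy_gatekeepers(adjacency, budget):
    """Pick up to `budget` nodes greedily by marginal coverage gain."""
    covered = set()   # people who already have a trained friend
    chosen = []       # selected gatekeepers
    for _ in range(budget):
        best, best_gain = None, 0
        for node in adjacency:
            if node in chosen:
                continue
            gain = len(adjacency[node] - covered)  # newly covered neighbors
            if gain > best_gain:
                best, best_gain = node, gain
        if best is None:   # no candidate adds any coverage
            break
        chosen.append(best)
        covered |= adjacency[best]
    return chosen, covered

# Toy undirected friendship network, given as neighbor sets.
graph = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B"},
    "D": {"B", "E"},
    "E": {"D"},
}
chosen, covered = greedy_gatekeepers(graph, budget=2)
print(chosen, sorted(covered))  # training B and D covers everyone
```

With a budget of two, the greedy rule first picks the best-connected person and then whoever covers the most still-unwatched peers.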
The model developed in this study is smart enough to handle real-life uncertainties. If some individuals in the network, for example, are unable to take gatekeeper training, the algorithm still yields robust node coverage.
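One way to picture that robustness requirement is to ask how much coverage survives in the worst case if some chosen gatekeepers cannot complete training. The check below is a hypothetical sketch with invented toy data, not the study's optimization model.

```python
from itertools import combinations

def worst_case_coverage(adjacency, chosen, max_dropouts):
    """Smallest covered set over all ways `max_dropouts` gatekeepers may drop out."""
    worst = None
    for dropped in combinations(chosen, max_dropouts):
        remaining = [n for n in chosen if n not in dropped]
        covered = set().union(*(adjacency[n] for n in remaining)) if remaining else set()
        if worst is None or len(covered) < len(worst):
            worst = covered
    return worst

graph = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B"},
    "D": {"B", "E"},
    "E": {"D"},
}
# If either B or D may drop out, only the other's neighbors stay covered.
worst = worst_case_coverage(graph, chosen=["B", "D"], max_dropouts=1)
print(sorted(worst))
```

A robust selection maximizes this worst-case coverage rather than the best-case one.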
Researchers looked at the network of young homeless individuals in Los Angeles, given that 50% of them have experienced suicidal ideation. With limited resources (gatekeepers), the new model can efficiently expand suicide prevention training for this vulnerable population.
In fact, it can help policymakers (who decide the funding of such initiatives) determine the minimum number of gatekeepers who need to receive suicide prevention training so that every person in the network has at least one trained relative or friend watching out for them.
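That planning question resembles greedy set cover: keep adding gatekeepers until everyone is watched over. The sketch below, on an invented network, gives an approximation of the minimum rather than a guaranteed optimum.

```python
def min_gatekeepers(adjacency):
    """Greedily add gatekeepers until every person has a trained friend."""
    everyone = set(adjacency)
    covered, chosen = set(), []
    while covered != everyone:
        # Pick whoever covers the most still-unwatched people.
        best = max(adjacency, key=lambda n: len(adjacency[n] - covered))
        if not adjacency[best] - covered:
            break  # remaining people have no friends in the network
        chosen.append(best)
        covered |= adjacency[best]
    return chosen

graph = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B"},
    "D": {"B", "E"},
    "E": {"D"},
}
print(min_gatekeepers(graph))  # two trainees suffice here
```

The size of the returned list is the estimate a planner could use when budgeting training slots.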
The ultimate goal is to build an unbiased AI that can protect homeless youth. Existing algorithms, if implemented without customization, produce discriminatory results, with up to a 60% difference in protection rate across racial groups.
The new algorithm, on the other hand, can reduce the bias in coverage of vulnerable groups of homeless youth by as much as 20%.
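The disparity being reduced can be made concrete by comparing coverage rates across demographic groups. This is purely illustrative: the graph, group labels, and gatekeeper choice are all invented, and the study's fairness constraints are part of the optimization itself, not a post-hoc check.

```python
def coverage_by_group(adjacency, chosen, group_of):
    """Fraction of each group with at least one chosen gatekeeper as a friend."""
    covered = set().union(*(adjacency[n] for n in chosen)) if chosen else set()
    totals, hits = {}, {}
    for node, group in group_of.items():
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + (node in covered)
    return {g: hits[g] / totals[g] for g in totals}

# Two disconnected friend pairs, one per (invented) demographic group.
graph = {"A": {"B"}, "B": {"A"}, "C": {"D"}, "D": {"C"}}
groups = {"A": "g1", "B": "g1", "C": "g2", "D": "g2"}

rates = coverage_by_group(graph, chosen=["B"], group_of=groups)
disparity = max(rates.values()) - min(rates.values())
print(rates, disparity)  # training only in g1 leaves g2 unprotected
```

A fairness-aware selection would bound this max-minus-min gap while still maximizing overall coverage.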
The research team believes it is worthwhile to further investigate the robust covering problem with fairness constraints. It poses numerous challenges, addresses computationally hard problems, and advances the field of computer science while pushing the boundaries of risk-management science.
In addition to preventing suicidal ideation and death, the AI can also be used to protect people during natural disasters.