South Korea has one of the highest suicide rates in the world, with about 27 suicides per 100,000 people in 2019—by comparison, the US rate that year was about 14. The 27 bridges crossing the country’s Han River have a bad reputation for drawing suicide attempts. But that may be about to change with the help of an artificial intelligence-powered CCTV system.
In an attempt to prevent suicides, Seoul, the nation’s capital, has established four centres along the Han River where workers monitor live video feeds from nearly 600 CCTV cameras fixed on 10 bridges. As of now, if a worker sees someone attempt suicide or suspects that a person is about to jump, they can have rescuers at the bridge within just four minutes.
Sadly, it is often hard for human surveillance teams to tell whether someone is simply taking in the view or preparing to jump, and in some cases that means action is taken too late. The current system has allowed Seoul to save 96 per cent of the nearly 500 people who attempt suicide at the bridges every year, but it writes off the remaining 4 per cent of cases as ‘unpredictable’.
Furthermore, if a monitor sends a rescue team out when one isn’t needed, that’s a waste of resources. If they dismiss an actual suicide attempt as someone who’s simply admiring the view, that could lead to the loss of a life.
But the AI system being developed at the Seoul Institute of Technology could help. It has been learning patterns of behaviour by analysing data from cameras, sensors and the dispatch records of rescue services since April 2020, the institute said on 30 June.
Drawing on hours of CCTV footage and assessing details such as how long a person lingers or hesitates, the AI can forecast a hazardous situation and immediately alert rescue teams, principal researcher Kim Jun-chul told Reuters.
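The institute has not published the system’s technical details, but one way to picture it is as a classifier trained on behavioural features extracted from video, with past rescue-dispatch records supplying the labels. The Python sketch below is purely illustrative: the feature set, the toy training data and the alert threshold are all assumptions made for the sake of the example, not a description of the Seoul system.

```python
# Illustrative sketch only: a binary risk classifier over hand-crafted
# behavioural features. The features, data and threshold here are
# hypothetical; the real Seoul system's design has not been published.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each row: [dwell_time_s, hesitation_events, railing_distance_m, hour_of_day]
# Labels come from (hypothetical) past dispatch records: 1 = rescue was needed.
X_train = np.array([
    [600, 12, 0.3, 2],   # long dwell at night, close to the railing
    [45,   0, 3.0, 14],  # brief daytime stop, well away from the railing
    [420,  8, 0.5, 23],
    [120,  1, 2.5, 11],
])
y_train = np.array([1, 0, 1, 0])

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

def assess(features, alert_threshold=0.7):
    """Return True if the estimated risk warrants alerting a rescue team."""
    risk = model.predict_proba([features])[0][1]  # estimated P(rescue needed)
    return risk >= alert_threshold

# Someone lingering near the railing late at night scores as high risk:
print(assess([540, 10, 0.4, 1]))  # True under this toy model
```

In a real deployment the hard part would be the step this sketch skips: turning raw CCTV frames into behavioural signals such as dwell time or hesitation, which is where the pattern-learning the institute describes would come in.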
“We believe the new CCTV will enable our crews to detect the cases a bit faster and help us head to a call more promptly,” Kim Hyeong-gil, who is in charge of the Yeouido Water Rescue Brigade, told Reuters as he monitored real-time footage from bridges on Seoul’s Han River.
Although the programme is only being tested for now, the city aims to use what it learns to fully launch the AI-powered system at the end of the year. If it can cut false alarms while catching the attempts human monitors currently miss, the system could save many lives. The number of rescue dispatches surged about 30 per cent in 2020 compared with the year before, the rescue brigade’s Kim explained, and many of the attempts were made by people in their 20s and 30s as the coronavirus pandemic brought greater economic hardship and intensified competition for jobs.
Of course, video surveillance is an ethically complex subject, and some have already expressed concerns that Seoul’s AI is an invasion of privacy that will also be used to track people. “At the very least, the government should be providing signage and give notice to the public walking on these bridges that these new measures are in effect,” Ann Cavoukian, former privacy commissioner of Ontario, Canada, told CTV News.
Currently, Seoul has CCTV operators working on three rotating shifts that cover 24 hours a day, seven days a week, at four different control centres in the Yeouido, Banpo, Ttukseom and Gwangnaru neighbourhoods on the river.
Only time will tell whether the AI actually improves Seoul’s ability to predict suicide attempts and send help in time to stop them. But if it works as hoped, similar AIs could one day monitor other high-risk locations, potentially helping to lower suicide rates well beyond the Han River.
The US military has faced growing problems in its ranks for decades now. According to Pentagon data, the suicide rate for active-duty troops rose from 20.4 suicides per 100,000 in 2014 to 25.9 in 2019. That same year, there were 7,825 reports of sexual assault involving service members as victims, according to The New York Times. And in the last three months of 2020, suicides among National Guard troops nearly tripled, from 14 to 39 compared with the same period the year before.
But how do you prevent service members from taking their own lives or sexually assaulting one another? You train them to intervene whenever they see alarming signs. And which tool do you use for this very specific type of training? Virtual reality (VR) headsets. At least, that’s what the Air Force is testing at the moment.
Years of prevention training—often in the form of somnolence-inducing PowerPoint presentations—have done little to reduce the rates of either problem. Whether the VR model can ultimately do better remains an open question, but military officials are encouraged by the early self-reported responses to the training.
So far, over 1,000 Air Force personnel have participated in the training, reports The New York Times. Of those who tried it, 97 per cent said they would recommend it, and trainees reported an increased likelihood of intervening with a person in crisis, Air Force officials added. Among those aged 18 to 25, a generation more accustomed to interactive virtual experiences and the one that makes up the majority of new recruits, the impact increased sevenfold. Officials intend to train at least 10,000 airmen with the programme this year.
The VR programme is built on bystander intervention, which experts have identified as one of the few effective tactics against both problems. If trainees witness harassment in a bar, for instance, or spot messages on social media suggesting a suicide threat, they will know how to approach the situation tactfully.
The programme places airmen in related scenarios with photo-realistic actors and coaches them on which responses would be constructive and which might not be. “You are an active participant. You have to be ready. I think that it is going to help airmen retain and remember knowledge,” said Carmen Schott, the sexual assault prevention and response programme manager for the Air Force’s Air Mobility Command. Schott continued, telling The New York Times: “We don’t want people to feel judged. They may not make perfect decisions, but they will learn skills.”
In the military, many barriers can get in the way of people trying to intervene—especially against someone of a higher rank. As of now, airmen going through the programme have only been interacting with suicidal virtual colleagues via their headsets. Another bystander programme, which will roll out in July, will place the users in a bar, watching a scene of sexual harassment unfold.
But this VR programme is not the only educational tool taking a more hands-on approach. Gamification also seems to play a crucial part in successful, impactful teaching. With the growing use of computer-based therapy in mental health, and promising results from gamification in psychotherapy, the push to gamify this kind of education is part of a broader movement: deploying video games to target health issues ranging from depression to tobacco use.