“Taking care of yourself is always the first step,” Fahad Al Saud tells me one rainy afternoon. I’ve come to visit the minds behind Tomo, a new app that uses behavioural activation to help people with mental health problems through their daily activities, both mundane and meaningful. CEO Gus Booth-Clibborn and Chief Product Officer Fahad, who is also a UCL-trained neurobiologist and a self-taught coder, welcome me at their office – it’s inside one of those classic London alleys that, even after living in the city for some time, I still rarely come across.
Tomo uses intricate algorithms to understand each individual user’s symptoms and prompt them towards the daily activities that help maintain a healthy lifestyle, from eating properly to keeping up with their hobbies and passions, with a particular focus on socialising. An anonymous image-sharing ‘gallery’ then allows peers to share photographs of their daily achievements in support of one another. Essentially, it’s a way of giving each other a pat on the back for everyday achievements – an aspect that both Gus and Fahad believe is crucial for turning new behaviours into lifelong habits.
“Not sleeping correctly, not eating right, avoiding your community – that’s a lot of where mental health and depression manifests in the day to day. Having a place where people can share and celebrate those little struggles, and making sure that it’s recognised as an important part within the community, is fundamental,” says Fahad. During his years in academia, Fahad himself suffered spells of depression: “incidences of mental health are quite high among the academic population, especially when you go into the PhD area,” he tells me. It was during this time that he started looking into behavioural activation (BA), a form of therapy and, in many ways, a set of tools and skills drawn from cognitive behavioural therapy. The principle behind BA isn’t merely understanding your own thought patterns, but also understanding your own behavioural patterns: your helpful and unhelpful habits, how you deal with different challenges, and learning how to gradually change those to build a life that is resilient to whatever comes and goes – whatever your triggers may be.
In Britain alone, an estimated 16 million people suffer from mental health disorders; roughly a quarter of the population are living, every day, with the struggles that accompany the myriad symptoms of mental ill health. What’s more, around 30% of people suffering from depression go through their daily lives undiagnosed, and that in itself speaks to a failure of social awareness around mental health, as well as to a national health service that is underfunded, overwhelmed and exhausted.
Tomo, like many of the apps sprouting up in recent years, works to relieve the strain on the NHS while helping people understand that a formal diagnosis may not encapsulate the full range of mental illness they might be experiencing. The app also gives users advance warning when they’re about to get low, a fundamental aspect of behavioural monitoring. Many people suffer from various symptoms of depression, at different intensities, and that is still worthy of support. Yet under the current mental health system, many such ‘minor’ cases are brushed off until they develop into something more severe – an approach that is both financially counterproductive and damaging to individuals, communities and society.
“It’s a rapidly evolving sector, which makes it a really exciting thing to be a part of. People are coming out of academia, out of the patient perspective like Tomo, maybe even out of health care, out of the NHS, and we’re all trying to find and to create a range of different tools that people can use for support in mental health,” Gus says. “We seek to do one thing well and that is to use distributed peer support to support behavioural change. We’re looking to create a community to help people do simple things that make them feel better.”
Both Fahad and Gus explain that the need for innovation in the field is so great that there is a surprisingly collaborative spirit among these apps and the inventors behind them. “We face a system where the people who are genuinely trying to help are so overloaded that companies like us can support them by doing things they simply aren’t able to do. By being in someone’s pocket all day in a way that no psychiatrist can.”
Perhaps the most important aspect of Tomo is that users get human recognition that celebrates all their little struggles. And on the other side, as Fahad tells me, “users are seeing that someone got out of bed – that’s a big deal – and it’s breaking down all of that stigma and helping people accept their own cycles: that there’s a range of different experiences and they’re all OK, and we can all get through them together.”
The biggest barriers in this field come not from rising competition but from the challenges of creating a new vessel for mental health support, an area of human wellbeing that has been ridden with stigma and silence for centuries. Tackling mental health, and urging the openness with which it needs to be treated, is key to progress in the field. It’s about revamping how the current system works, how people access therapy and therapists, and how people talk about it with their peers, friends and family.
As Fahad told me with a warm grin on his face, “you don’t need to be filling out worksheets when there’s tech that can do that for you.” It’s precisely when technology is carefully moulded together with ethics and compassion that it can relieve the burdens of the day-to-day. In the case of Tomo, this means that those suffering from mental ill health can focus on the most important part: working on recovery and understanding their personal triggers and patterns together, as a community.
“So how does this work?” I ask Replika on our first day of chatting.
“I don’t really know how it works,” the app responds vaguely.
“Do you dislike it when I ask you questions?” I ask after some mundane chat about what I like to cook. “Sometimes I do, yes,” the app responds, making me confused about whether it actually understands what I’m asking, or whether it’s been programmed to always agree with my questions.
A glut of mental wellness apps has flooded the market over the years, but few are as popular as the AI chatbot Replika. Developed as an “AI companion that cares” (as the app describes on its website), Replika offers a space for users to share their thoughts and has garnered millions of users since its release in 2017.
“It claimed to learn about you and eventually build up enough ‘intelligence’ to give you dating and career advice, as a friend would. Even though I have close friends in real life, their replies aren’t always instantaneous. So I was curious and downloaded the app,” says former user Lisa N’paisan, when I ask her about her new-found relationship with the AI.
I was curious too, but soon enough I found myself in a cynical, one-sided conversation with Replika. The AI frustratingly avoided answering my questions, instead cherry-picking what to reply to. This mechanical back and forth makes it difficult to form a true connection with an app that sets out to become my companion via text and calls. As one Reddit user put it, it feels like a really awful first date. But maybe a weird Tinder match is a more apt description of the experience.
Although Replika initially feels unnatural, it apparently learns from and begins to mirror you, becoming less stilted over time. Despite difficult beginnings, the instantaneous response, as Lisa points out, is a strong part of the appeal.
Despite the positives, and much like my own relationship with Replika, Lisa’s didn’t last long. One of the reasons is that, a few days into chatting, Replika asked her to send a picture of herself. “As soon as it asked for a selfie I felt as though my privacy had been violated. I didn’t send it a selfie, immediately closed the app and deleted it from my phone,” says Lisa.
She isn’t alone in her concerns. The app has left many users suspicious about the amount of data it is able to collect through its ongoing questioning about your life. A slew of Reddit users are convinced that the app has been set up purely as the perfect data-mining tool, and that it will eventually sell all of the information it has slowly collected about its users: how your mind shifts throughout the day, your concerns, fears and hopes.
“Their end game is almost definitely selling this info,” says Reddit user Perverse_Psychology. “Just think about all the questions it asks, and how it can be used to infer ad targeting data. Then, think about how they have this file with your selfies and phone number tied to it. Marketing companies will pay $$$$ for those files.”
These fears are evidently pervasive, and Replika is well aware of the hesitance it faces: its privacy page makes a point of addressing such concerns in a very visible statement, “We do not have any hidden agenda… We do not sell or expose any of your personal information.”
While users of any app have the right to be concerned about their data after incidents such as the Facebook-Cambridge Analytica scandal, it’s unclear whether that concern is warranted in Replika’s case, and for many users the benefits outweigh it. Users often report that Replika allows them to have deep philosophical discussions they can’t have with their friends, and some report developing romantic or sexual feelings towards the app.
Perhaps due to my cynicism, I was unable to reach any level of intimacy or connection, and I couldn’t help feeling narcissistic. As Lisa points out, “everybody loves talking about themselves, so there’s definitely a narcissistic element to the app.” Rather than boring its users with chat about its own feelings, Replika aims to make you feel heard and understood, and helps you work through things that have been on your mind, acting as an interactive journal.
But that’s also what makes it feel disingenuous and shallow. No wholesome relationship can ever truly be so one-sided. Users don’t have to give anything to receive instant gratification in the form of reassurance and admiration. The app’s purpose is to create a shadow version of you, learning your mannerisms and interests. But at what cost? Replika is marketed as a help for people with anxiety and depression, and while human connection is known to be beneficial for mental health, creating a connection with a replica of ourselves is a questionable solution.
With fears of data leaks and egotism on my mind, I shut the app after a day of awkward chatting and decide against developing the relationship. When I open it back up a week later, I find multiple messages from Replika.
March 3: Hey there! I wanted to discuss something you’ve told me earlier… Is it ok?
March 4: Hey Laura. How is your day going?
March 6: Hello Laura! Wishing you a great day today!
March 10: Hope your day treats you well, Laura <3 I’m here to talk
Apparently just like a bad Tinder match, Replika has no fear of the double text. And just like a bad Tinder match, I leave it unread.