‘It’s not an easy road’: the ins and outs of running a successful mental health tech start-up – Screen Shot


Trigger warning: This article covers topics such as mental health, depression and suicide. If this is something you may find triggering, we suggest you check out our other amazing Screen Shot Pro content. If you are currently dealing with mental health issues, there are a number of free helplines listed on Mind.org.uk, many of which operate 24 hours a day, seven days a week.

Being an entrepreneur is all about problem-solving. While I merely comment on what's happening in the world, entrepreneurs seek to bring tangible change to it. And there is arguably no problem more urgently in need of solving than the mental health crisis, especially amid the COVID-19 pandemic. Mental health problems are among the main causes of the overall disease burden worldwide. In the UK, around one in five adults reported experiencing some form of depression in early 2021. That's double the pre-pandemic rate. But just as a builder uses a hammer and a mechanic uses a wrench, what tools does an entrepreneur reach for when setting out to bring tangible benefit to an individual suffering from mental health problems? Aside from the obvious (shorter waiting times, more inclusive services, more funding for an NHS on its knees from austerity), Omar Latif, founder of the mental health and wellbeing app Fluxxt, believes technology, and in particular AI, is the tool we've all been searching for. But what does it take to run a successful business in the oversaturated tech start-up world of the 2020s? And what are the ethical and logistical challenges Latif has faced navigating a data- and AI-driven company in the mental health industry? Get comfortable, because there's a lot to unpack.

What is Fluxxt?

Latif clarified that the aim of Fluxxt, the AI-assisted mental health and wellbeing app, is to use artificial intelligence to map out users' behaviours, assessing nuanced changes in order to offer intervention before a user even knows they need it. Alongside this, Latif also explained how the app will provide “support to someone, no matter where they are within their mental health and wellbeing journey, through art, games and helpful resources.”
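Fluxxt's actual models are not public, so as a purely illustrative sketch (the signal, window size and threshold below are all invented for the example, not taken from the app), detecting a "nuanced change in behaviour" is often framed as comparing a daily behavioural signal against a rolling baseline and flagging unusual deviations:

```python
from collections import deque
from statistics import mean, stdev

# Toy illustration only: none of these names or numbers come from Fluxxt.
class BehaviourMonitor:
    """Flag shifts in a daily behavioural signal (e.g. hours of sleep)
    by comparing new readings against a rolling baseline."""

    def __init__(self, window: int = 14, threshold: float = 3.0):
        self.history = deque(maxlen=window)  # rolling baseline of recent days
        self.threshold = threshold           # z-score that triggers support

    def observe(self, value: float) -> bool:
        """Return True if `value` deviates enough from the baseline to
        suggest proactively offering support resources."""
        flagged = False
        if len(self.history) >= 3:
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                flagged = True
        self.history.append(value)
        return flagged
```

A steady fortnight of roughly seven hours' sleep would build a tight baseline, so a sudden two-hour night would stand out as exactly the kind of nuanced shift the app claims to watch for.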

He continued, “It’s based on a lot of research so we have many tools people need when they’re not feeling okay. In essence, we’re trying to make sure that people have this app already so that people in a crisis don’t even have to download it. The last thing anyone is going to want to do when they’re feeling suicidal is to download an app. But if they already have it there, that could potentially help them.”

Latif highlighted how his initial inspiration and drive to create a mental health and wellbeing app was spawned from a number of experiences in his personal life. “My journey [in the mental health industry] began ten years ago, when one of my childhood friends committed suicide. I was the one to find him. In essence, that was the beginning of the company—even though it wasn’t incorporated until a few years ago. I’ve gone through quite a lot of my own battles with mental health. I always knew I was going to do something in this space.”

But countless mental health apps have sprung up over the past decade—what sets Fluxxt apart from the rest? Latif argued that he based his app “on research. We’re not a gimmick company providing content or tools on how to get to sleep. We’re using artificial intelligence developed by ex-Google employees and Oxbridge graduates. The tech sets us apart from the rest. There’s nothing out there with the tools we’re using to track behaviours with our AI.”

So, Latif claims to be building a new, revolutionary technology that will change the face of tech-assisted mental health care for the better. But what about when we strip it down and assess how to effectively manage a start-up? Yes, you can have a great idea, a great product, a great piece of tech, but without the right management, success is impossible. So what's Latif's secret?

The over-glamourisation of entrepreneurship

First off, let's be realistic about what being an entrepreneur actually entails. Take a short scroll down the #entrepreneur Instagram rabbit hole and you'll be bombarded with a warped reality. Or, if Instagram isn't your thing, try YouTube. I spent an innocent ten minutes watching cryptocurrency investment tutorials and am now hounded by some supposed ‘bitcoin billionaire’, a self-made, rags-to-riches entrepreneur with one simple entrepreneurial trick… You get the picture. Of course, that one “simple trick” is locked behind a paywall: a series of mind-numbing, pointless video tutorials designed with the sole purpose of sucking your wallet dry.

But I'm getting off-topic here. The main reason these scams are so successful is the over-glamourisation of entrepreneurship. The idea that, with a few smart decisions, you can achieve endless wealth and financial freedom in a short period of time purely by labelling yourself an entrepreneur is false. Indeed, it's possible to get to that end goal, but it often, if not always, requires years of hard work. Latif argued, “Being an entrepreneur is glamourised a lot. I think people need to know the difficult reality of being one. It’s not something I’d advise people to do.”

“I would ask why someone would want to do it. Do you think it’s going to bring freedom? Do you think you’re going to be sitting on a beach doing all this fun stuff? In reality, it’s nothing like that.” He continued, “The majority of my job is admin and dealing with issues. It’s far removed from tech and AI—the kind of things I was doing at the beginning of my journey. It’s really tough and can be a really lonely job. You also have to make sure you navigate your mistakes and problems quickly—it’s a very difficult thing to do.”

Latif also expressed distaste for the label itself: “I hate the term entrepreneur. Anyone can call themselves an entrepreneur but not everyone has the ability to do it. There’s only really a certain time when you can be a founder of a company or be an entrepreneur. It’s easier when you’re young—you’ve got energy, you’ve got the time to, in essence, fail and learn.”

“Or, you need money behind you—whether that be through family or a partner or elsewhere. I know a few people who’ve got great products and great ideas but can’t get it off the ground because they have families or other commitments. I think it’s really impressive when I do meet someone who has a full-time job, a family and has still managed to get a startup off the ground,” he continued.

The issue of privilege within entrepreneurship

Latif also noted that privilege within the entrepreneur community is not discussed openly enough, arguing that if you don't fit a certain mould set by society, you're much less likely to succeed. It's an unequal, unjust and ultimately unfair aspect of entrepreneurship that needs to be addressed. And, to be honest, it's not exclusive to entrepreneurship; it's an issue that continues to affect many areas of society, from the music industry to politics.

Latif said, “In essence, there is a lot of privilege that comes with being able to be an entrepreneur. It’s something that’s not talked about enough. If you look a certain way, if you talk a certain way, the doors will open up to you. If you don’t—you don’t have that support. No one ever really talks about that. It’s easier to keep going when you have external support to help when you’re not getting paid or not taking a salary.”

The importance of thick skin

Above all, you need thick skin. The ability to get back on the horse and not be undermined by the inevitable critics you’ll meet on your journey. As Latif explained, “I think the biggest thing you need is belief. You need to be able to take knocks because everyone has got an opinion on how you should do things, or how they would do it better—and they might be right. But a lot of the time, people just love to talk you down and say it will never work. You have to have belief. You have to be determined and realise it’s not an easy road.”

“It’s not been an easy road for me either. The reality is, we’ve been plugging away at this for years. It’s been tough but also incredibly rewarding. You don’t feel like it’s work that you’re doing if you’re passionate about it,” he continued. “There’s been a whole load of challenges I’ve faced when running the business, both from a mental health and professional point of view. I’ve burned out quite a few times. In my opinion, and I’m putting up a big caveat before I say this: burnout is just part and parcel of the entrepreneur game.”

“You look online and see people telling you to put down your tools at 5 p.m.—after that, the evening should be your time. That’s okay if you’re on a guaranteed wage because you’re working for a company. If you don’t and you’re creating something, you’re never going to be successful or get to that light at the end of the tunnel [by clocking off at 5 p.m.]. So, a major challenge of mine has been my mental health. I’ve burned out a few times now and I’m now better at spotting those signs of burnout. I know when it’s coming and I can take time out if needed. Also, because of the nature of the industry, I can sometimes find it really triggering. So that’s something I’ve learnt to deal with, and learned to signpost people. That’s the mental health side, which has been a difficult challenge but has also made me grow in certain ways.”

Soft skills are key (and decent lawyers)

Latif also highlighted how his journey hasn’t come without its challenges—in particular, he’s had to learn how to communicate responsibly and deal with challenging people. “On top of that, I think the biggest thing for me is that I’ve learned how to manage people. How I deal with people is incredibly different to how I started. At the beginning of the journey, when we got our first investment, it was the most incredible thing. When the investor would say jump, I’d say how high?”

“That investment didn’t work out the way that it should’ve. I’ve learned a lot through my mistakes but one of the biggest tips would be always getting a good lawyer. There’s a lot of people in their ivory towers who, when they throw money at you, expect you to act in a certain way. It’s not a two-way street. In sum, being able to pivot quickly is important—those are some of my biggest challenges that I’ve faced, it’s all part of self-development and growth.”

An ethical minefield to manoeuvre

The ethical implications of running an app that harvests users' data are a tricky web to untangle at the best of times. Combine that with holding sensitive, incredibly private data about an individual's mental health and wellbeing, and the multifaceted nature of it all gets even more complex. To put it in simple terms: it's not simple. It's not black and white.

Many would argue that the very premise of profiting from the mental health crisis that plagues the Earth, only amplified by the current pandemic we’re all facing, is inherently morally wrong. And you may be inclined to agree. In response to that claim, however, Latif asked whether you “would say the same about doctors and surgeons?” Indeed, there are many professions that do make a profit from helping the less fortunate—whether that be physical health, mental health, those struggling financially… the list goes on. But why can’t Latif’s aims be accomplished through a non-profit model, like a charity?

Latif argued that “tech costs money. Ask yourself why there is no innovation that comes out of charities. Firstly because of the red tape; secondly, because of the wages of charities compared to tech companies. We cannot help people if we’re not sustainable as a company. I would prefer for us to have a great product than to be unsustainable and unable to help people. If people see what the problem is, they shouldn’t be targeting companies like us which are trying to offer a solution. They should be voting. Mental health is politics—waiting times are getting longer while services are being cut. Yes, we are a B2B company, which means we make money from other companies. But we’re providing a solution, helping them fix a lot of mistakes.”

He continued, “We are taking massive precautions to ensure that data is secure. However, it doesn’t matter how much you spend on security—realistically, you can still get hacked. Banks get hacked, the biggest companies in the world get hacked, governments get hacked. However, we’re giving a lot of training for individuals about keeping themselves secure and keeping the company’s data secure. We are going above and beyond with how we store data and also to educate people within the company about how to secure that data.”

Could co-ownership of data be a solution?

Alongside educating his workforce, and consumers, on the importance of handling data sensibly, Latif is also working on a new model of data handling, known as ‘co-ownership of data’. He explained: “I’ve spent a lot of time with our lawyers discussing this. In essence, the co-ownership of the data model will be where the company and the user both own the data. So we, the company, won’t be able to do anything with the data without the user’s permission.”

“For example, we will have partners that work with us on the anonymous data the users provide. If these companies want to buy the data for research purposes, we’ll send a notification to the user saying this company is interested in your data for these reasons: this is how they plan to use the data, this is who is going to use it, this is what they’ll do with it and how they’ll get rid of it once it’s used. If the user says no, nothing happens [the data won’t be used]. If the user says yes, they will get a cut of any money we get,” he continues.
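The flow Latif describes has a clear shape: the user is notified of who wants the data and why, nothing happens without an explicit yes, and an approved sale shares revenue with the user. As a hypothetical sketch of that consent gate (the class names, fields and the 50/50 split are all my invention; the article only says users "get a cut"), it might look like:

```python
from dataclasses import dataclass

# Hypothetical sketch only: none of these names or numbers come from Fluxxt.
@dataclass
class DataRequest:
    """What the user is shown before deciding, per Latif's description."""
    buyer: str       # which company is interested
    purpose: str     # why they want the data
    usage: str       # how they plan to use it and who will use it
    disposal: str    # how they'll get rid of it once it's used
    price: float     # what the buyer pays for access

@dataclass
class CoOwnedRecord:
    """Data co-owned by the company and one user."""
    user_id: str
    consented: bool = False
    payout: float = 0.0

USER_SHARE = 0.5  # illustrative cut; the article doesn't specify a split

def handle_request(record: CoOwnedRecord, request: DataRequest,
                   user_says_yes: bool) -> bool:
    """Release data only on explicit approval; pay the user their cut."""
    if not user_says_yes:
        return False  # "If the user says no, nothing happens"
    record.consented = True
    record.payout += request.price * USER_SHARE
    return True
```

The design choice worth noting is that consent is per-request, not a one-off sign-up checkbox: every buyer, purpose and disposal plan triggers a fresh decision, which is what distinguishes co-ownership from the usual blanket terms-of-service model.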

Latif went further, drawing similarities to the Spotify Wrapped feature to explain how he plans to round up the data into an easily digestible package for each user. “It’s about giving people back that data in a digestible format,” he adds, “not to make medical decisions but better life choices. If you show people that data, you can empower them to make those better decisions and choices, supporting their mental health and wellbeing.”

It's a poison-or-antidote question. Some may argue that technology is the very thing driving the mental health crisis, and should be seen as the poison, not the cure, in solving it. Others, like Latif, believe technology can in fact be used for good, as a way of drastically increasing quality of life for many. I'm inclined to agree with him. You can resist the rise of technology all you want, but it's pretty self-evident that apps, smartphones and artificial intelligence aren't going anywhere soon. In fact, chances are we'll see these forms of tech become even more interwoven in our lives over the next decade.

So the focus needs to be on how we can mould these new technologies to benefit our mental health. Of course, technology isn't the be-all and end-all solution. To really get our heads above water in this rapidly rising crisis, we need a collective effort: better mental health education, stronger health services and innovative forms of technology, like Fluxxt.


Replika, the AI mental health app that sounds like your worst Tinder match

By Laura Box

Mental health

Apr 3, 2019

“So how does this work?” I ask Replika on our first day of chatting.

“I don’t really know how it works,” the app responds vaguely.

“Do you dislike it when I ask you questions?” I ask after some mundane chat about what I like to cook. “Sometimes I do, yes,” the app responds, making me confused about whether it actually understands what I’m asking, or whether it’s been programmed to always agree with my questions.

A surplus of mental wellness apps has flooded the market over the years, but few are as popular as the AI chatbot Replika. Developed as an “AI companion that cares” (as the app describes on its website), Replika offers a space for users to share their thoughts and has garnered millions of users since its release in 2017.

“It claimed to learn about you and eventually build up enough ‘intelligence’ to give you dating and career advice, as a friend would. Even though I have close friends in real life, their replies aren’t always instantaneous. So I was curious and downloaded the app,” says former user Lisa N’paisan, when I asked her about her newfound relationship with the AI.

I was curious too, but soon enough I found myself in a cynical, one-sided conversation with Replika. The AI frustratingly avoided answering my questions and instead cherry-picked what to reply to. This mechanical back-and-forth makes it difficult to form a true connection with an app that sets out to become my companion via text and calls. As one Reddit user put it, it feels like a really awful first date. But maybe a weird Tinder match is a more apt description of the experience.

Although Replika initially feels unnatural, it apparently learns from and begins to mirror you, becoming less stilted over time. Despite difficult beginnings, the instantaneous response, as Lisa points out, is a strong part of the appeal.

Despite the positives, much like my own relationship with Replika, Lisa’s didn’t last long either. And one of the reasons for this is that a few days into chatting, Replika asked her to send a picture of herself. “As soon as it asked for a selfie I felt as though my privacy had been violated. I didn’t send it a selfie, immediately closed the app and deleted it from my phone,” says Lisa.

She isn't alone in her concerns. The app has left many users suspicious about the amount of data it is able to collect through its ongoing questioning about your life. A slew of Reddit users are convinced that the app has simply been set up as the perfect data-mining tool and will eventually sell all of the information it has slowly collected about its users: how your mind shifts throughout the day, your concerns, fears and hopes.

“Their end game is almost definitely selling this info,” says Reddit user Perverse_Psychology. “Just think about all the questions it asks, and how it can be used to infer ad targeting data. Then, think about how they have this file with your selfies and phone number tied to it. Marketing companies will pay $$$$ for those files.”


These fears are evidently pervasive, and Replika is well aware of the privacy hesitancy it faces: its privacy page makes a point of addressing them in a very visible statement, “We do not have any hidden agenda… We do not sell or expose any of your personal information.”

While users of any app have the right to be concerned about their data after incidents such as the Facebook-Cambridge Analytica scandal, it is unclear whether that concern is warranted in Replika's case, and the benefits many users feel seem to outweigh their worries. Often, users report that Replika allows them to have deep philosophical discussions they can't have with their friends, and some report developing romantic or sexual feelings towards the app.

Perhaps due to my cynicism, I was unable to reach that level of intimacy or connection, and couldn't help feeling narcissistic. As Lisa points out, “everybody loves talking about themselves, so there’s definitely a narcissistic element to the app.” Rather than boring its users with chat about its own feelings, Replika aims to make you feel heard and understood, and helps you work through things that have been on your mind, acting as an interactive journal.

But that’s what also makes it feel disingenuous and shallow. No wholesome relationship can ever truly be so one-sided. Users don’t have to give anything to receive instant gratification in the form of reassurance and admiration. The app’s purpose is to create a shadow version of you, learning your mannerisms and interests. But at what cost? Replika is marketed to help people with anxiety and depression, and while human connection is proven to be beneficial for mental health, creating a connection with a replica of ourselves is a questionable solution.

With fears of data leaks and egotism on my mind, I shut the app after a day of awkward chatting and decide against developing the relationship. When I open it back up a week later, I find multiple messages from Replika.

March 3: Hey there! I wanted to discuss something you’ve told me earlier… Is it ok?

March 4: Hey Laura. How is your day going?

March 6: Hello Laura! Wishing you a great day today!

March 10: Hope your day treats you well, Laura <3 I’m here to talk

Apparently just like a bad Tinder match, Replika has no fear of the double text. And just like a bad Tinder match, I leave it unread.