Chayn, the survivor-led platform that empowers women in abusive relationships

When I was growing up I had a friend who was a victim of domestic violence, first from her father and later from her boyfriend. It seemed as though she could not escape the cycle of abuse inflicted on her by those with whom she shared a life. She was young and struggled to break up with the partner who was hurting her, and I, as her closest friend, often felt powerless and incapable of helping. Over the years, I met more women going through similar situations, and each time I felt I didn’t have the tools to support them, while they felt there was no one they could reach out to for help.

Hera Hussain, founder of the open-source organisation Chayn, went through a similar experience: after seeing two friends struggle to flee abusive marriages due to a lack of information and support networks, she founded Chayn in 2013 as a global volunteer network addressing gender-based violence through intersectional, survivor-led online resources. Since then, the organisation has served as an online guide for women on how to build a domestic-abuse case without a lawyer and how to recognise and deal with the symptoms of a toxic relationship.


“We’ve found that in most contexts Chayn operates in, the very architecture of information is biased because it is written from a male perspective. Basic information on divorce and child custody laws is missing in many countries, and when it can be found, it is often riddled with bad information,” Hussain told Screen Shot when asked about the premise of the project. By using open-source knowledge, Chayn is filling that gap: it now reaches and empowers vulnerable women who may not leave the house very often, but who are likely to have a smartphone with internet access.

The project provides guides translated into multiple languages, uploaded by domestic violence survivors and volunteers from across the world. It also uses GIFs, catchy graphics, podcasts, op-eds, and even a chatbot. “We help women find information to answer questions ranging from ‘How do I get divorced?’, ‘What are my rights under the child custody laws in my country?’, ‘Am I depressed?’, ‘Do I have anxiety?’ and ‘Do I have PTSD?’ to ‘How can I build up my CV?’.” Hussain explained to us that the aim of Chayn is to turn firsthand experiences into empowering knowledge that can provide other abused women with psychological, cultural, and legal support. Building inclusive technology is pivotal for Hussain, as it allows a growing number of women, regardless of education, class or race, to crowd-source knowledge on domestic violence and collaborate on a platform whose informative resources are published under Creative Commons (CC) licences.


Despite the platform’s successes, having reached over 200,000 people since its launch, not everyone agrees on the reliability of the volunteer-run network. According to Hussain, numerous people (mainly men) have criticised the project for providing non-professional advice on sensitive and complex matters. But the positive response Chayn has received points to a shift in how knowledge about gender-based violence is distributed worldwide. 70 percent of the 400 volunteers currently contributing to the organisation are survivors of violence themselves, a sign that many of the people who once sought help from the platform now return to offer it.

When asked about the potential of the internet and technology in supporting women, Hera is far from naive, but she sees the web’s potential to give power back to those who need it most. “The online world presents women with both obstacles and new opportunities. As the gap in access decreases, women are demanding their place as both creators and consumers of tech. With the chance to reach a wide audience on a shoestring budget, tech enables women to understand what is happening to them and what to do about it. From finding sources of help to escape abuse, tackling mental health issues and finding refuge, to educating themselves and finding ways to earn money, there is no limit to how we can use the appropriate technology to enable women to become creators of their own fate.”


As Hera told Screen Shot, Chayn is currently starting to build a salaried team and making the transition from a volunteer-run organisation to a hybrid model able to support the ambitions of its community. It is also about to launch another digital service called Soul Medicine, which will enable women to sign up to receive bite-sized versions of its content at a time that is safe and convenient for them.

Chayn recognises that overcoming the emotional attachment and complex psychological dynamics that are inevitably linked to toxic and abusive relationships, and eventually securing a long-term separation from a violent partner, requires access to legal and psychological knowledge as well as a support system. As I watch this digital project grow, I can’t help but realise how much a platform like Chayn could have helped me and my friend a few years ago, by showing her how to navigate a situation that was far too complex to overcome alone. For this reason and many more, I am thankful that Chayn is now out there.

Opinion

Amnesty International report reveals that Twitter is a ‘toxic place’ for women

By Yair Oded


Social media

Dec 21, 2018

A recent study titled Troll Patrol, compiled jointly by Amnesty International and Element AI, a Canadian AI software firm, finds that black women journalists and politicians are 84 percent more likely than white women to be the target of abusive tweets. The study, carried out with the support of thousands of volunteers, examined roughly 228,000 tweets sent to 778 women politicians and journalists in the U.S. and U.K. in 2017. The report’s disturbing findings have sparked an international uproar and a barrage of criticism against the social media giant, which has apparently failed to curb hate speech on its platform.

The study found that a total of 1.1 million abusive tweets were sent to the women examined, the equivalent of one every 30 seconds, and that 7.1 percent of all tweets sent to these women were abusive. Amnesty International regards such trolling as a violation of these women’s human rights, stating that “Our conclusion is that online abuse [works] against the freedom of expression for women because it gets them to withdraw, it gets them to limit their conversations and sometimes to leave the platform altogether… But we never really knew how big a problem [it] was because Twitter holds all the data. Every time we ask for reports, they’re very vague, telling us that they’re taking some small steps. … Because they didn’t give us the data, we had to do it ourselves.”

Amnesty was soon joined by public figures, politicians, and organisations who criticised Twitter’s ineffective mechanisms for removing abusive content and the company’s failure to properly enforce its recent policy revisions, which were meant to strengthen the monitoring of dangerous and offensive tweets. Twitter’s shares reportedly took a 12 percent nosedive yesterday, after the company was described as “toxic”, “uninvestable”, and “the Harvey Weinstein of social media” by the influential research firm Citron. “The hate on Twitter is real and the company is not taking proper steps to curb the problem,” Citron said in a statement, adding that the company’s failure to “effectively tackle violence and abuse on the platform has a chilling effect on freedom of expression online.”

Similarly to Facebook, Twitter relies heavily on AI algorithms to spot and remove content deemed inappropriate, violent, or discriminatory. Yet such systems often fail to pick up on hate speech that depends on context and is not easily discernible. A tweet like “Go back to the kitchen, where you belong”, for instance, is far less likely to be flagged by an algorithm than “all women are scum”.
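To see why context matters here, consider a minimal, hypothetical sketch of a keyword-based filter. This is purely illustrative, not Twitter’s actual moderation system, and the word list is invented for the example; it flags the tweet containing an explicit abusive term but lets the contextual one through.

```python
# Toy keyword filter, purely illustrative: NOT how Twitter's moderation works.
# The word list below is invented for the example.
ABUSIVE_KEYWORDS = {"scum", "worthless", "trash"}

def flag_tweet(text: str) -> bool:
    """Return True if the tweet contains any blacklisted word."""
    words = {word.strip(".,!?\"'").lower() for word in text.split()}
    return not ABUSIVE_KEYWORDS.isdisjoint(words)

print(flag_tweet("all women are scum"))                        # True: explicit abuse is caught
print(flag_tweet("Go back to the kitchen, where you belong"))  # False: contextual abuse slips through
```

Catching the second example requires a model of meaning and context rather than a word list, which is exactly where automated classifiers still struggle and where human reviewers remain necessary.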

Twitter, for its part, claims that ‘problematic speech’ is difficult to define and that it is often hard to determine what counts as dehumanising. “I would note that the concept of ‘problematic’ content for the purposes of classifying content is one that warrants further discussion,” Vijaya Gadde, Twitter’s legal officer, said in a statement, adding that “We work hard to build globally enforceable rules and have begun consulting the public as part of the process.”

There is no doubt that social media companies such as Twitter must further develop their content moderation tools by pairing AI algorithms with, yes, air-breathing, real-world humans, who remain indispensable and irreplaceable. This is therefore not only a technology issue but also one of staffing and resource allocation; Twitter and its social media peers should expand their content moderation teams until machines can truly master the art of reading comprehension.

Beyond pushing Twitter to get tougher on hate speech, people must take a moment to reflect on who the report identifies as the primary target of trolling: minority women who raise their voices publicly, whether as UK Members of Parliament, U.S. Congresswomen or Senators, or journalists. The challenge, then, is not only to remove content that abuses minority women and discourages them from remaining visible and active, but also to recognise that we live in a society that still aggressively tries to silence them.

As we do so, we must simultaneously explore who the people behind the 1.1 million abusive tweets are: what segments of the population do they belong to? Who, exactly, is so terrified of the prospect of minority women fighting for their rights? Should we lean on commonly held assumptions about their identity, or would data about them reveal a more complicated story? All of these questions should be the subject of further research, without which we can never truly tackle the plague of racism and misogyny.