Specialists and support workers say social media and chatting apps “systematically” fail women subjected to online sexual violence. Photo: Shutterstock

‘Epidemic of misogyny’: Why Hong Kong, Singapore and the rest of Asia should be concerned about Big Tech failing women

  • A new study says Meta-owned Instagram neglected to act on 90 per cent of misogynistic abuse, which included sexual, violent, and hate messages
  • Social media and chat apps need to change the way they operate as abuse against women becomes ‘normalised’, Hong Kong and Singapore specialists say

Sharan Dhaliwal, the founder of South Asian culture magazine Burnt Roti, says she has received hundreds of “d**k pics” over the years via Instagram, the social media platform on which she has more than 10,000 followers.

The high-profile writer blocks these users – who send unsolicited images of male genitalia – but the abuse continues.

“It’s a power play … it’s about them feeling they have power and can walk away from that saying: ‘I did that’,” Dhaliwal said.

She was responding to a recent report about gender-based violence and misogynistic abuse on Instagram, which is owned by Meta Platforms (formerly Facebook Inc).


Specialists and support workers in Asia and Europe say social media and chatting apps are “systematically” failing women, as image-based abuse – when someone shares, or threatens to share, intimate images without the consent of those featured in them – and other forms of online sexual violence increased during the coronavirus pandemic.

Their comments come after a recent report by the Center for Countering Digital Hate (CCDH), a US-headquartered non-profit organisation that focuses on disrupting online hate and misinformation, showed that Instagram didn’t act on 90 per cent of abuse sent via direct messages, also known as DMs. CCDH said the findings represented “an epidemic of misogynist abuse taking place in women’s DMs”.

Hollywood actor Amber Heard has received abusive messages on Instagram. Photo: Reuters

The centre analysed 8,717 messages received by five high-profile women on Instagram, including Hollywood actor Amber Heard and Dhaliwal, who has been vocal about South Asian women’s rights.

Clare McGlynn QC, a professor of law at Durham University in England and a specialist on image-based sexual abuse, said the report showed how problematic the “systems and processes” of social media platforms are.

“It is not only an issue of abuse and harassment, but the key point is that the platforms are making this abuse worse,” McGlynn told This Week in Asia. “If celebrities are failing to get responses from social media companies, what hope is there for ordinary people?”


Shailey Hingorani, head of research and advocacy for the Singapore-based non-profit organisation Aware, agreed there was “much to be concerned about in this new report around the gaps in Instagram’s frameworks for handling abuse, and its failure to live up to its own promises to users”.

She said that it was no longer a surprise that women were experiencing a high volume of abuse on social media. “However, the rates at which Instagram takes action against abusers were startlingly low,” Hingorani noted.

According to the study, Instagram didn’t respond to most instances of misogynist abuse, which included unsolicited nude photos and videos, violent messages, and even death threats sent to women.

The research showed that the social media app didn’t act on nine in 10 violent threats over DMs, even when these cases were reported using the platform’s tools, and it also failed to act on image-based sexual abuse messages within 48 hours.

Among those who sent abusive messages, 227 of 253 accounts remained active at least a month after they were reported, the study said.


Abusers sometimes try to reach out to their targets via video call on Instagram. One account tried to call Dhaliwal after sending her images of male genitalia. Over a three-day period, another stranger sent her 42 messages, some with sexual content, and then attempted to video call her.

Audio messages have also been used by abusers. While “you can dissociate from most abuse”, receiving a voice note felt particularly invasive, Dhaliwal told CCDH. “When you hear their voice, it becomes more real,” she said.

Cindy Southworth, Meta’s head of women’s safety, said in a statement that the social media company disagreed “with many of the CCDH’s conclusions”.

“We do agree that the harassment of women is unacceptable,” she said. “That’s why we don’t allow gender-based hate or any threat of sexual violence, and last year we announced stronger protections for female public figures.”

Southworth added that “calls from people you don’t know only go through if you accept their message request, and we offer a way to filter abusive messages so you never have to see them”.


In Hong Kong, Jacey Kan, an advocacy officer with the Association Concerning Sexual Violence Against Women, said the non-profit had noticed some of the failures highlighted in the report.

“We do see that Instagram’s response to reports of image-based abuse and other forms of online sexual violence is not as prompt and effective as other major social networks,” she said.

Kan noted that Instagram was often used as a tool where “targeting, coercion, and non-consensual distribution happen”. But, she said, “image-based abuse on Instagram is way more hidden”, adding that the platform’s moderation on “sensitive contents” was “confusing”.

Some women who reported their cases to Instagram also “concluded that similar cases had different review results” by the app. Kan said she was aware of survivors who questioned if different criteria were applied depending on the language used, such as Cantonese and English.

Online abuse towards women increased during the coronavirus pandemic. Photo: Shutterstock

The non-profit has seen a sharp increase in cases of image-based abuse in Hong Kong across different apps and platforms – from 44 cases in 2019 to 200 last year.

Due to the growing demand, the group launched a service called “Ta-DA”, which focuses on helping survivors request that platforms take down non-consensual intimate content. In its first year of operation, it handled 309 links.

“More than 80 per cent of them were removed after our follow-up,” Kan said.


In Singapore, Aware’s Sexual Assault Care Centre saw 163 cases of technology-facilitated sexual violence last year, whereas in 2020 it had handled 191 new cases. Most of them involved sexual, nude, and other types of intimate images or videos.

McGlynn noted that many “forms of image-based sexual abuse are becoming normalised. For example, cyberflashing – sending penis images without consent – is becoming so common and trivialised”.

Victims and survivors around the world are facing increased challenges, although specific social and cultural contexts may make it even harder for some.

“We think victim-survivors face similar obstacles and stigma when it comes to seeking help, reporting, and disclosing their experiences,” Kan said. “But Hong Kong is lagging behind the ongoing post-#MeToo public and online conversation on consent, body autonomy, and related reforms … [This] contributes to a less supportive environment when victim-survivors of image-based abuse come out and share their experience,” she said.

Kan said that survivors from Hong Kong and Taiwan were often mocked and harassed in local online forums. “It is very rare” to find supportive comments for the survivors, she noted.

Instagram and parent company Meta Platforms didn’t act on 90 per cent of abuse sent via direct messages. Photo: Reuters

Hingorani said there was a general understanding that the concept of “face” – which may include “reputation, respect, prestige and honour” – is more highly valued in Asia. Such a concept is also “often associated with female chastity, which may result in more social stigma around sexual violence in the Asian context”.

But, she noted, it was important to avoid generalisations about a continent as diverse as Asia as well as “false dichotomies” between Asia and the West, “especially with regards to a global social media platform”.

Nisha Rai, a youth engagement coordinator for an Alliance for Action focused on tackling online harm against women and girls in Singapore, said the findings of the recent report were “disturbing”, but not very surprising.

Women have faced abuse across various social media platforms and chat apps. Rai, a 23-year-old political-science university student, has monitored 15 groups on the encrypted messaging platform Telegram whose members share child pornography and non-consensual photos and videos of women and girls. Many of the members are from Singapore and Malaysia.

“I feel that image-based abuse has got worse … Given that these perpetrators are able to conceal their identities, it is no surprise that they are further emboldened to behave the way they are,” she said.

Huge numbers of women and girls face online abuse and ‘gender-based hate’. Photo: Shutterstock

Silvia Semenzin, an activist and lecturer in new media and digital culture at the University of Amsterdam, said: “We are facing a massive problem that is growing and increasing everywhere.”

Recent data “should make us reflect on how badly gender-based hate is affecting women around the world”, she added.

But “I am very concerned that, because of international crises and emergencies, violence against women and girls always remains in the back as a non-urgent issue to address”.

Semenzin said firms like Meta needed to take survivors’ perspectives into account.

“The problem is that these companies … will always put their economic profit before human rights,” she said. “The very fuel of their existence is data extraction … We should push for transnational policies that disrupt the power of these monopolies and transform social media into fairer and more equal public spheres.”


Kan said platforms should identify serial abusers and introduce effective complaint mechanisms that support victims of sexual abuse and harassment.

Hingorani said that “seeing consequences ... for bad actors, platforms, and senior officials within companies” could “probably shift the needle quickly towards greater safety”.

But the key point, McGlynn said, was to get the companies to change their systems in a way that reduces abuse and its virality.

“This means ‘safety by design’ as a key principle for companies. It means them undertaking proper risk assessment … It is a whole change in the way platforms operate that is needed,” the professor said.

McGlynn argued that no social media platform has taken abuse and harassment seriously enough. “They all say that they do; but time and again reports like this are published showing that, in practice, social media companies are failing women particularly.”
