Ritumbra Manuvie is on a mission
Hatebook
The Hutu massacres, apartheid, the Holocaust: history is full of grave instances of complicity. It’s something that needs to be taken very seriously, says Ritumbra Manuvie, law lecturer at University College Groningen. That applies to social media platforms like Facebook, too. And if Facebook won’t take that responsibility willingly, she says, it should be forced to.
Facebook itself has admitted to unwittingly promoting incendiary posts. Since the 2020 US election, the company has hired tens of thousands of moderators to curb the spread of such messages. But those moderators focus mainly on the English-language market; according to Manuvie, posts in other languages go largely unchecked.
Stateless Muslims
She witnessed the effects of this hatefulness in the Indian state of Assam, where she was doing her PhD research on migration caused by climate change. She knew the people there were plagued by floods. ‘They had nothing, not even their papers; they’d lost them to the water.’
The government then decided to conduct a census that required people to show their papers. Those who had lost theirs could apply for new ones, with one caveat: Muslims could not. ‘All of a sudden, 1.9 million people were stateless.’ Manuvie was astonished at the new law. ‘How could they do something like this?’
There was a wave of posts about how brilliant this law was
At the same time, the Hindu community applauded the law. ‘It was evident on Facebook: there was a wave of posts about how brilliant it was. But I was looking at the situation through the eyes of an academic and I knew that it wasn’t. What was happening?’
It turned out that in the preceding years, Facebook and other media had seen an increase in negative political messaging about Muslims. ‘People were saying that Muslims had invaded our country and that we should deport them to Bangladesh.’
These messages originated with groups in favour of a Hindu ethnostate. Human rights organisations such as Amnesty accuse Narendra Modi’s governing party of wanting this as well.
Cognitive bias
The change happened gradually, over the course of a few years, says Manuvie. ‘People’s timelines start to change, and they see their friends joining far-right Hindu groups. They might start wondering what’s going on.’
It’s a case of cognitive bias, she explains; psychologists call it the illusory truth effect. People might initially dismiss the posts as nonsense. ‘But the more you’re confronted with them, the easier it becomes to believe them. People start to make connections, and they start thinking there might be some truth to these posts.’
Later, The Wall Street Journal revealed that Facebook India’s head of policy, whose team was supposed to moderate posts like these, supported the Modi government and had instructed her employees not to act against Modi supporters. Facebook denied any partiality.
While people in India took to the streets to protest against the new law, Manuvie and other Indian expats gathered in a pub in Brussels to discuss how to show their support. They talked about the inflammatory messages in the media and decided to set up a foundation to do academic research into these posts. They named it The London Story (TLS), after the pub where it was born. Perhaps facts could save the day.
Reported
Facebook is a big deal in India. While counts vary, the country has hundreds of millions of Facebook users, more than the United States. In fact, Facebook came close to becoming the portal to the entire internet there, just as it did in Myanmar. In Myanmar, Facebook is now being sued for billions, because it did nothing to stop the inflammatory posts that preceded the Rohingya genocide in 2017.
The more you’re confronted with these messages, the easier it becomes to believe them
Using CrowdTangle, a Facebook tool for marketers and researchers, Manuvie and her team found more than six hundred Facebook pages railing against Muslims in India. They reported two hundred posts from these pages to Facebook. But the posts weren’t removed right away, says Manuvie, even though they clearly violated the rules Facebook claims to enforce. In one video, far-right religious leader Yati Narsinghanand calls for the extermination of all Muslims.
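CrowdTangle exposes this kind of data through an API. Below is a minimal sketch of the sort of query a researcher could run, assuming an approved API token; the endpoint and parameter names follow CrowdTangle’s public documentation, while the search term and date range are purely illustrative, not TLS’s actual query.

```python
# Minimal sketch of a CrowdTangle post search, assuming an approved API token.
# The search term and date range are illustrative, not TLS's actual query.
import requests

API_TOKEN = "YOUR_CROWDTANGLE_TOKEN"  # access must be granted by CrowdTangle

response = requests.get(
    "https://api.crowdtangle.com/posts/search",
    params={
        "token": API_TOKEN,
        "searchTerm": "example keyword",  # hypothetical query term
        "platforms": "facebook",
        "startDate": "2020-01-01",
        "endDate": "2020-12-31",
        "count": 100,                     # posts per page of results
    },
)
response.raise_for_status()

# Each result carries the post text, the page or account that posted it,
# and engagement statistics such as likes, shares, and comments.
for post in response.json()["result"]["posts"]:
    print(post["account"]["name"], post.get("postUrl"))
```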
Facebook eventually removed that video, two weeks before it published its own report on the company’s impact on Indian society. Although ‘report’ is perhaps a strong word: the company published a four-page summary. The rest was kept confidential, says Manuvie. ‘The summary was just a PR move.’
Covid
Then there was the investigation into disinformation on Facebook in the Netherlands in the run-up to the Lower House elections of 2021. On pages run by right-wing movements, the foundation found 938 posts that, according to TLS, broke Facebook’s own rules about Covid disinformation and should have been removed. Some claimed the Covid vaccines contained microchips, or that Covid was no worse than the flu; others peddled QAnon-like conspiracy theories.
Interestingly enough, some of these messages had been posted by the Dutch political party Forum voor Democratie, including advertising purchased by FVD, Manuvie found out. The party even spent more on Facebook advertising than any other Dutch party.
One of the posts they reported to Facebook consisted of the statement ‘Let the Netherlands be free again’, meaning free from the lockdown. Isn’t TLS being a little strict? No, says Manuvie. ‘There’s a good reason we included this post in our report. We had serious discussions about many of the posts.’
They assessed each post on its context, its author, and how its intended audience might interpret it. Their conclusion on this particular post: ‘Sure, we’re in a lockdown, but so is everyone else. Saying your freedom is being curtailed is taking it too far. This isn’t Nazi Germany.’
Nutcases
But at the same time, she stands up for the ‘nutcases’. She’s not surprised by the comparison they draw between unvaccinated people and Jews during the Second World War. ‘Sure, they’re not being deported and murdered, but there are similarities in the way people talk about them’, says Manuvie. ‘As though they’re fundamentally wrong, retarded, mentally unstable. Treating them like second-class citizens is wrong.’
People may have removed the posts themselves because they can be used as evidence
She was also dismayed that the Dutch government called these people, its own citizens, ‘nutcases’. ‘In the end, they’re your citizens, your people. If you don’t talk to them, if you can’t explain why these restrictions are necessary, you fail as a government. It’s that simple.’
TLS reported each of those 938 posts to Facebook. Almost every time, they received an automated response. Sometimes a real person would reply, but the message was always the same: there were no grounds to remove the posts. When TLS published its report five months later, twelve of the messages had disappeared anyway. ‘Some of them were removed, I think, because they threatened Mark Rutte.’
Moderators
A year later, approximately 45 percent had been removed, she says. But she doubts Facebook removed them. ‘People like Willem Engel might have deleted their own posts, since they can be used as evidence against them in court.’ She also doubts there are any moderators at work.
The moderators who work on the Dutch market do so from offices in Berlin. It’s a difficult job, since they have to read some pretty intense posts and look at horrific images, a former employee said in a VPRO radio documentary.
Manuvie argues that in addition to moderating posts, Facebook should arrange for independent oversight to prevent the spread of inflammatory content. In the meantime, she’s calling on people not to use Facebook.
Students probably won’t have much trouble with that. Many of them consider Facebook dead and buried; they prefer Instagram. ‘Good’, says Manuvie, ‘I’m glad.’ Although, she adds: ‘Instagram is owned by the same company. It’s just as lacking in transparency and uses the same guidelines to counter disinformation and incitement.’