Community standards or double standards when it comes to posts about Kashmir?

On 4 September 2019, Stand With Kashmir, a popular Facebook page with 37,000 followers, tweeted that the page had been blocked. “It looks like they are trying to shut down our page because we are the only page actively talking about Kashmir,” Mariah, founder and admin of the page, told Soch. Mariah is a Kashmiri-American based in Chicago. She created the page in 2016, but did not actively operate it until very recently.

The move came a month into the Indian government’s military lockdown and communications blackout in Kashmir, which started on 5 August when India stripped Kashmir of its special status under Article 370 by presidential order. Soon, internet and cellular access in Jammu and Kashmir were cut off, and the region plunged into darkness.

It has been about a month and a half since Kashmir was cut off from the world, and in that time, people around the globe have taken to social media platforms to voice their concerns and amplify the voices of Kashmiri citizens unable to communicate with the outside world.

But just weeks later, major social media platforms began curbing these voices, and Stand With Kashmir became one of the first victims of this siege.

Community standards or double standards 

Facebook was founded in 2004 and has an average of 1.59 billion daily active users, making it one of the most used social applications around the globe. Over the years, Facebook has developed a 22-point set of Community Standards, which it uses to moderate content.

“Our Community Standards are the detailed and public set of guidelines that outline what we do and don’t allow on Facebook. They are designed to create a safe environment where people feel free to express themselves and share different points of view,” a Facebook spokesperson said in an email to Soch. “We work hard to enforce our Community Standards fairly and consistently across our diverse global community, and we remove content that violates these rules, regardless of who posted it and who reported it.”

Facebook removes content from its platform for various reasons. Some of these are obvious, such as hate speech, fake news, plagiarised content, or content that incites violence. In just the first quarter of 2019, Facebook took down 4 million hate speech posts and 2.19 billion fake accounts.

Here’s how the process works: if a piece of content violates Facebook’s Community Standards, it is either removed by the automated system or sent to human moderators. Facebook has about 15,000 dedicated content reviewers around the world who review content that is reported by the community or flagged by its proactive detection technology.

Facebook also has two other sets of guidelines — the terms of service and the data policy — which users agree to when signing up for an account on the website, and a violation of which may get accounts suspended.

Mariah argues that the content Stand With Kashmir shares has never violated Facebook’s community guidelines. “We are a group of 30 volunteers and make sure that every post is following the community guidelines. We do not even share any Al Jazeera videos if we assume they may remotely violate a set standard,” she explained. The page gained followers and popularity just last month, when the Indian government abrogated Articles 370 and 35A.

But with popularity, also came hate.

On 4 August of this year — one day before India revoked Kashmir’s special status — Stand With Kashmir created an Instagram account. Just days later, on 6 August, the account was suspended. “Our Instagram account got deleted three times,” said Mariah. “We created a new handle [instantly], @stand_w_kashmir, and that one was taken down as well. The message I got from Instagram when these pages were reported stated: ‘Your account has been disabled for violating terms’.”

Credit: Mariah/Stand With Kashmir 

To bring the account back online, she drew on her personal network to try and find out why Instagram — which is owned by Facebook — had shut down the account, despite the fact that it wasn’t violating community standards. Soon after Mariah made several inquiries, Instagram released Stand With Kashmir’s handle and the page went back online.

But the relief didn’t last long. A few days later, Mariah noticed a stream of comments on Stand With Kashmir’s Instagram posts asking other users to report the account. “They [Indian trolls] commented, saying, ‘Hi guys, let’s just report this account,’ and then there was a stream of people just reporting us,” Mariah said, confirming that most of the profiles reporting Stand With Kashmir’s Instagram page were Indian. When the handle got pulled down for the second time, they had to go through the same person again. “They basically had to explain to Instagram that they (Instagram) were curbing freedom of speech, and that the handle was doing nothing against the rules and could be audited,” she said.

On 9 August, their captioning privilege was revoked, meaning that Stand With Kashmir could no longer caption the pictures posted on their handle. “(Our page faced various restrictions) again on 11 August and on 22 August. In one instance, a protest video that was posted received an error message stating that the video couldn’t be played in certain countries,” Mariah said.

The third time the Instagram page got deleted was the most curious. “The Instagram representative told my contact that the page was ‘accidentally’ deleted and they are trying to figure out who did this,” said Mariah, wondering how a social networking service as popular as Instagram could arbitrarily delete a handle and keep no internal record of who did it.

Stand With Kashmir isn’t a lone voice protesting Instagram and Facebook’s censorship of those attempting to voice support for the citizens of Kashmir. Posts by prominent Pakistani activists and journalists were also taken down when they shared videos depicting the conditions in Kashmir. “Facebook should not succumb to the pressure of any intelligence agency, whether Pakistani, Indian or any other. It should maintain its own standards. We are talking about RIGHTS and not scoring political points,” said Anees Jillani in a Facebook status published after his post was taken down for allegedly violating community standards.

Particularly strange was the incident involving Atif Tauqeer, a widely followed Germany-based Pakistani journalist whose posts had their dates changed.

None of the posts or pages that received notifications from Facebook had evidently violated the community standards cited in those notifications. Mariah believes that Facebook needs to work on its standards. “I have reported bullies, harassers and violent comments multiple times to Facebook but their replies have been unsatisfactory, stating that these comments do not go against their policies,” she said, adding, “If bullying does not violate their rules, how do other people reporting our page get noticed?”

She also awaits a response from Facebook to the appeals she sent explaining Stand With Kashmir’s crucial role in the current situation in the region.

Other pages have faced similar treatment. Pakistan Defence, for instance, awaits an explanation from Facebook regarding a notification that its page could be unpublished. “Dear Facebook, we request you to please provide us with an explanation on how exactly this post, which is sourced from a mainstream broadcaster, goes against FB community standards?” Pakistan Defence said in a post on Facebook.

Freedom of speech is limited

Unsurprisingly, this trend is consistent across social media platforms. Twitter, too, has been suspending accounts and removing posts that speak out against the atrocities in Kashmir.

Bushra Iqbal is a radio jockey at Radio Pakistan. Her account was suspended after she retweeted some anti-BJP tweets. When she tried to log in, she saw a message that her account was suspended until further notice. “I did not receive any notification but only realised it when I tried logging in,” she said.


Numerous other accounts were suspended without any notice. A leading Pakistani news outlet reported that 200 Pakistani Twitter handles had been suspended in connection with the lockdown in Kashmir.

Javeria Siddique, a Pakistani journalist, also asked her followers on Twitter to help her trace suspended accounts. People were seen sharing screenshots on her thread of such suspensions or blocks. 

Feriha, who had been compiling a list of people whose Twitter handles had been suspended, found out on 18 August that her own account had been restricted. “My account came to the Indian government’s attention when I started replying to popular Indian trolls. My account was suspended two days after the Indian home ministry wrote a letter to Twitter India to take action against Pakistani handles,” Feriha said. Twitter notified her that her account was suspended for impersonation and could not be restored.

She appealed to Twitter, which refused to restore her handle. It was only after DG ISPR Asif Ghafoor tweeted asking Pakistanis to mention suspended accounts that her suspension was noticed. “Within two days, my account was restored without citing any reason for suspending & then restoring it,” she added. Since then, all her past tweets have vanished.

Even tweets by the President of Pakistan were reported.

Ironically, Twitter is being used by pages to announce Facebook’s attempts at stifling free speech. Indus Broadcasting Corporation was blocked from posting, and was at risk of being unpublished from Facebook, for sharing protest videos from Srinagar, Kashmir on its page. It then took to Twitter to announce Facebook’s attempts to block the page. The page has since been restored, but IBC, too, is waiting for a response from Facebook as to why such an action was being considered, if at all.

The most recent case was reported by Makhdoom Shahab Ud Din, who said that his page, with millions of followers, was banned because he spoke up for Kashmir. In early September 2019, Shahab noticed that his reach had drastically decreased, and he messaged Facebook asking for a reason. “They replied stating that I should check my content, but there isn’t anything wrong on our end,” Shahab said. On 10 September, he received an official notification stating that his posts won’t be shown on news feeds. “I have attended Facebook meetings and I am a partner. I am sure that the content I am sharing is not violating any Facebook community standards,” he said, adding that he has not received any notifications of his page being mass reported. “The only reason I can think of is that it was because of the content I shared about Kashmir recently, which has got the least engagement.”

Pakistani authorities, including the DG ISPR, claimed to have taken the matter up with Twitter and Facebook on 18 August.

Disparity in the system

Social media is used to connect people and bring the world closer together. Sites like Facebook, Twitter and Instagram are also advertised as “safe spaces” where discourse is promoted and perpetrators of hate speech and violence are barred.  

Facebook’s community standards are in place to tackle issues including violence and incitement, dangerous individuals and organizations, the promotion or publicising of crime, and coordinated harm.

Yet, Soch found a number of Facebook groups violating community standards on a regular basis and operating without restrictions. 

Numerous pages have been producing anti-Pakistan content, constantly mocking Pakistan and organising hate towards it. They directly violate Facebook community standards 11 — regarding hate speech — and 15 — regarding cruel and insensitive content.

Pakistan Murdabad has been operating since 2014 and hosts numerous insensitive memes propagating hate towards the country and its people.

Other pages, including Kalkari Creations, Hindu Nationalist, I am with BJP and RSS, and Pakistan Ki Tabahi, are also violating standards 11 and 15.

Soch also discovered several pages violating Facebook community standards regarding coordinated harm and incitement of violence. Kashmiri Hindu Front - KHF2, for instance, is a page that promotes the Indian occupation of Kashmir and posts memes, articles and opinions justifying India’s clampdown on Kashmir. The page has also incited violence, promoted organized hate led by dangerous organizations, promoted crime, and coordinated harm.

Azad Essa, a New York-based journalist, believes that the disparity could be down to India’s influence in the world of business and technology.

He tweeted: “Who are Kashmiris meant to turn to if Google and Facebook are buddies with those who are going to take over their homes?” His thread explains more.

What are social media organisations up to?

Shmyla Khan, Programme and Research Manager at the Digital Rights Foundation, believes that social media networks need to be more transparent about their practices and not hide behind the rhetoric of free speech. “There is a lack of transparency regarding the reason for the take-downs and suspensions; it is unclear whether these have been due to government requests or mass user reporting,” she said, adding that it has emerged that while government requests were made, the bulk of the suspensions have been due to mass reporting. “This speaks to the susceptibility of these platforms to manipulation by bad actors looking to silence and spread disinformation.”

Mariah believes that social media is an effective tool for joining forces at this crucial time for Kashmir. “Our goal is to connect all advocacy efforts and advocates so we can streamline and unify these efforts and ensure the narrative shared is coming from the ordinary Kashmiri trapped behind barbed wires,” she said. She also believes that if their voices are being curbed, it is a clear case of discrimination.

While users remain oblivious to the reasoning behind a take-down or suspension, the discriminatory application of these guidelines is in fact censoring legitimate speech that highlights crucial human rights issues. “Something is fundamentally broken in the content moderation system,” said Shmyla. “Tweets relating to human rights concerns are framed as an issue of terrorism. Given that this comes at a time when the residents of Indian-occupied Kashmir cannot speak for themselves due to a communications blackout, it is all the more unconscionable that speech regarding Kashmir is being censored like this,” she added.

Making Mistakes

An inside source at Facebook confirmed that the organisation does, in fact, make mistakes, and that it has been working towards enforcing its policies accurately and consistently. “The reason they expanded their appeals offering last year was to give people the chance to request a review if they think a decision was made incorrectly,” they said.

They also clarified that it is a common misconception that mass reporting leads to the removal of content. Facebook only removes content when it violates the Community Standards, as soon as it is made aware, no matter who reports it; a single report is enough. “If a piece of content does not violate our Community Standards, it will not be removed from the platform, whether we receive one or one thousand reports,” the source said.

Regarding the page and posts by Stand with Kashmir and Atif Tauqeer, the source explained that Facebook’s automation tools are designed to detect and block spammy content, malware or anything else that could compromise account security. “In this instance, these tools mistakenly detected something on the Stand with Kashmir and Atif Tauqueer pages that they thought was spam, which prompted temporary feature blocks to be placed on the page,” they said. Upon further review, Facebook determined that this was an automation error. “Therefore the feature blocks have been lifted on the pages, and they are fully operational again.”

Regarding posts by Indus Broadcasting and Pakistan Defence, the source claimed that both pages shared content depicting designated dangerous organizations, which is not allowed under Facebook’s Dangerous Organizations policy. “However, there are exceptions when this content is shared neutrally in a news reporting context, and therefore the content on both pages have been restored,” they said.

They did not respond to questions about why other pages Soch mentioned in its research were banned, or their posts removed. Facebook also did not respond to a request to share its list of dangerous organizations and individuals. This points to a lack of transparency on Facebook’s part regarding its policies, which has made the platform difficult for users to navigate, and suggests that such removals may continue.


Originally published on Soch Writing
