A trove of internal documents shows Facebook didn’t invest in key safety protocols in the company’s largest market.
At first, her feed filled with soft-core porn and other, mostly harmless, fare. Then violence flared in Kashmir, the site of a long-running territorial dispute between India and Pakistan. Indian Prime Minister Narendra Modi, campaigning for reelection as a nationalist strongman, unleashed retaliatory airstrikes that India claimed hit a terrorist training camp.
Soon, without any direction from the user, the Facebook account was flooded with pro-Modi propaganda and anti-Muslim hate speech. “300 dogs died now say long live India, death to Pakistan,” one post said, over a background of laughing emoji faces. “These are pakistani dogs,” read the translated caption of a photo, posted in the News Feed, of dead bodies lined up on stretchers.
An internal Facebook memo, reviewed by The Washington Post, called the dummy account test an “integrity nightmare” that underscored the vast difference between the experience of Facebook in India and what U.S. users typically encounter. One Facebook worker noted the staggering number of dead bodies.
Around the same time, in a dorm room in northern India, 8,000 miles from the company’s Silicon Valley headquarters, a Kashmiri student named Junaid told The Post he watched as his real Facebook page flooded with hateful messages. One said Kashmiris were “traitors who deserved to be shot.” Some of his classmates used these posts as their profile pictures on Facebook-owned WhatsApp.
Junaid, who spoke on the condition that only his first name be used for fear of retribution, recalled huddling in his room one evening as groups of men marched outside chanting death to Kashmiris. His phone buzzed with news of students from Kashmir being beaten in the streets — along with more violent Facebook messages.
“Hate spreads like wildfire on Facebook,” Junaid said. “None of the hate speech accounts were blocked.”