
The Silicon Valley mantra of ‘move fast and break things’ has fallen miserably short
http://tech.newstatesman.com/guest-opinion/move-fast-break-things-mantra
When Mark Zuckerberg built Facebook, he probably never expected to testify before a panel of legislators on user privacy or to navigate debates over censorship and content regulation. In his recent Congressional testimony, Zuckerberg admitted as much: “We have a responsibility to not just build tools, but to make sure these tools are used for good.” That responsibility is not limited to the Cambridge Analytica scandal: consider the proliferation of hate speech and extremism, and the consequences of Facebook’s failure to take a holistic view of its responsibility become even clearer. By reacting to these challenges rather than anticipating them in the first place, social media giants have allowed hate and extremism to spread at unprecedented scale. To get ahead of this problem, Facebook and other social media giants need to invest in tools that anticipate these risks.
The extreme right-wing group Britain First illustrates why this kind of investment matters. For years, Facebook tolerated Britain First’s presence despite evidence that it was exacerbating anti-Muslim hate. Facebook’s position was that Britain First was engaging in offensive but protected political speech. This ignored the coded language the group used precisely to circumvent moderation rules. Common claims that ‘Mohammed is a paedo’ and references to ‘Muslim rape gangs’ imply that all Muslims have tendencies towards sexual violence (a trope well known to scholars of anti-Muslim hate). The implication of such comments is obvious to any observer, but the group’s tactical use of language allowed it to evade Facebook’s content moderation. Through its page, Britain First coordinated a ‘mosque invasion’ campaign, posted video updates on the ‘Christian patrols’ that harassed residents of British Muslim neighbourhoods, and advertised merchandise. These activities grew rapidly from 2014 to 2018, until the group’s leaders, Paul Golding and Jayda Fransen, were convicted of hate crimes and Facebook shut down the page. By then the damage was done: the group had connected hundreds of anti-Muslim activists and had been propelled to international notoriety after Donald Trump retweeted a video posted by Fransen on ‘Muslim crime’.