The recent terrorist attack by a white supremacist on Muslim worshippers at two mosques in New Zealand killed 50 innocent people. The attack was a horrible incident overall, but the worst part of this deadly violence was that the killer live-streamed the entire shooting on his Facebook account. The live stream of people being killed raised serious questions for the social media company. In response to criticism from human-rights organizations and the international media, Facebook promised to improve its ability to recognize and block terrorist material.
Facebook users who search for related terms will now be directed to a charity that combats far-right radicalism. Previously, Facebook had allowed some white nationalist content that it did not consider racist, including permission for users to call for the creation of white ethno-states.
The social media giant had previously judged white nationalism to be an acceptable form of expression, on a par with things like Basque separatism and American pride, which it regarded as an essential part of people's identity. However, the company said in a blog post that after three months of consultation with academics and members of civil society, it had become clear that white nationalism cannot be meaningfully separated from white supremacy and organized hate groups.
Responsibilities of Social Media Companies
Leaders of many countries have urged social media companies to take serious responsibility for discouraging extremist material posted on their platforms. In response to the terror attack, New Zealand Prime Minister Jacinda Ardern said that social media networks were "the publisher, not just the postman" of the violence, referring to their potential liability for shared material.
Facebook acknowledged that the video of the attack on the two mosques was viewed more than 4,000 times before it was taken down from the killer's account. Facebook officials stated that they had blocked 1.2 million copies at the point of upload within 24 hours, and had deleted another 300,000 copies that had already been posted. YouTube and Facebook were both sued by a French Muslim group for allowing the video to be shared on their platforms.
Other tech companies also took steps to shut down sharing of the video. Reddit banned a long-standing discussion forum on its platform named 'watchpeopledie' after clips of the attack were shared there. Valve, owner of the Steam gaming network, removed more than a hundred user-created tributes memorializing the killer.