Self-moderation by social media platforms has struggled to ensure that the Internet is acceptable to all cultures, religions, political persuasions, and national values. This exercise in Internet governance is tall and detailed, and it draws criticism for being heavy-handed in its exercise of censorship. Though easy to describe, the task becomes difficult to fathom once its cross-border complexities are recognized. It is further complicated because the business of platforms is to solicit content submissions for all to see and be exposed to. Such solicitation inevitably opens platforms to hosting content that will at some point cross an unacceptable line. The underlying issue is that this role in Internet governance is carried out by hidden, unelected individuals who have a say over what is right and wrong in the public domain, and whose biases and interests can translate into decisions about what the platforms allow.
The issues run across many sensitivities where objectionable content is concerned. Balancing decency and respect with freedom of expression has certainly been tested. The responsibility has primarily rested on the platforms' efforts to self-police the flow of content through a form of "content moderation." Postings of violent incidents or committed acts of violence lead one to question the purpose of showing videos of a beheading, animal cruelty, a child being mutilated, a public flogging, or a child's circumcision. Easier to accept have been public political demonstrations, though with incidents of restrictions in certain regions of the world.
The argument for free speech is not a free-for-all, as some might imagine when it comes to implementing content policy. With hundreds of millions of videos uploaded daily, the task is monumental. It is an internationally impactful role that impinges on political interests and national policy interests. The major social platforms are employing and institutionalizing global content review teams to set policies. A violent event could be deemed to have political and news value. A comedic video could be deemed socially acceptable to some and objectionable, even sexually offensive, to others. In a sense, the global content policy initiatives of the social media platforms function as the eyes and ears of the user public, trying to discern right from wrong without a definitive line, working instead against a moving value target.
The implementation of self-moderation by YouTube, Facebook, and Google demands requirements that define the lines of acceptability. The distinctions fall under categories such as journalistic value, political interest, cultural information, education, and social creativity and expression, to name a possible few. The power over what may be posted on the Internet can have endless political significance for a country trying to keep suppressing its citizens, for a social or issue-driven cause ahead of an election, or for promoting a war or an invasion. Exercising discretion over what is allowed to be posted could drive the direction of politics, social development, and the creation of new laws; it could increase government control, erode national borders, or even blur the legal definitions of freedom, defamation, invasion of privacy, and human decency in the public domain. The startling realization is that this self-moderation effort is carried out by entities, and their employees, whom social media users never elected to tell them right from wrong in the public domain.
Out of concern for addressing Internet governance, groups have been organized. Many of these work in associations such as the Global Network Initiative, the Anti-Cyberhate Working Group, Facebook's Safety Advisory Board, and Twitter's Trust and Safety Council. While their stated goals may be lauded, their lack of transparency and their closed meetings cause concern and engender distrust and doubt about their accountability to civil society and to everyone's use of the public domain.