Case Analysis on Information Warfare

In the wake of the 2016 U.S. elections, information warfare has become a major public-security and democratic issue. Evidence of fake news and Russian interference in the election has raised alarm about the level of influence that social media networks hold over social institutions. Disinformation and misinformation have become major tools for both government and non-government actors. Facebook was long viewed as a neutral force in electoral politics. However, that view has changed drastically as lawmakers, activists, policymakers, and scholars have demonstrated how powerful the platform has grown. Facebook has the power to alter or influence people's access to information using powerful algorithms that direct informational resources. The company is in the spotlight over its lack of adequate firewalls and human resources to differentiate between accurate and false information. In this case analysis, I will argue that contractarianism shows us that while Facebook is not wholly responsible, it is liable and responsible for the disinformation spread on its platform and for the platform's weaponization by malicious actors.

In "Commanding the Trend: Social Media as Information Warfare," Jarred Prier notes that in contrast to traditional forms of cyberattack, modern cyberwarfare influences people's beliefs or behaviors and diminishes their trust in the government. Prier highlights two case studies of social media being used for misinformation, disinformation, and propaganda. First, social media was widely used by the Islamic State in 2014 to spread propaganda and recruit new members, including in the West. Second, Russia used hacking, disinformation, espionage, and the manipulation of social media to influence the direction of the 2016 United States presidential election. Russia's campaign used both Facebook and Twitter to launch a disinformation effort that began two years before the election.

Prier notes that social media sites such as Facebook and Twitter employ algorithms that analyze phrases, words, and hashtags and rank topics by popularity and relevance to people in different geographical regions. State and non-state actors can easily manipulate what is trending using three methods of trend control: trend hijacking, trend creation, and trend distribution. All three methods attach a different message to a trending topic and use it to spread misinformation (the toy sketch following this discussion illustrates how a volume-based trend ranker can be gamed this way). In the case of the Russian interference with the 2016 United States election, the information warfare was aimed at a target bigger than the election itself. Focusing on the operation alone ignores the underlying conditions that allowed false narratives to spread in the first place. Prier concludes that propaganda is a powerful tool, and when effectively used it can manipulate the masses. Social media provides a platform to take any trending topic and manipulate that information at massive scale. To this end, both state and non-state actors have the tools and the knowledge to weaponize social media. Because people rely on trending information, it becomes easy for trends to be hijacked and for false or manipulative content to be added to them.

The dilemma arises because social media companies must balance their business interests against the betterment of society. As a result, proper mechanisms have not been put in place, which allows actors to shape narratives and the will of the people, compromising people's trust in social institutions and their ability to establish the integrity of those institutions. Facebook is partly responsible for the erosion of democracy and public confidence in government. Facebook has a social contract to promote positive information access, not just profits for its shareholders. The social contract is grounded in contractarianism, a moral theory which claims that there is an unspoken agreement among all members of society. Contractarianism holds that there is no "natural law"; instead, "morality" is based on rules that allow people to live together in harmony. The social contract applies to all members of society and ensures justice and fairness in social structures and institutions. In this case, Facebook has an unwritten contract with its users to ensure that the platform is used for positive purposes, including supporting the freedom of information. However, Facebook's ad-driven business model failed to create effective firewalls and accountability structures to ensure that the platform is not used to spread misinformation.
As seen with the Russian interference in the 2016 U.S. election, thousands of ads were purchased through Facebook, which influenced the direction of public opinion and how Americans voted. While Facebook cannot monitor over two billion users, the company did not establish enough tools and resources to protect democratic institutions from propaganda and disinformation. In this case, Facebook features including News Feed and advertisements were hijacked to create or change narratives. Facebook has the responsibility to ensure that its systems are not used to attack democracy through misinformation or other means, a duty the company should treat very seriously.
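To make Prier's trend-manipulation point concrete, the minimal Python sketch below ranks hashtags purely by post volume. It is purely illustrative, not Facebook's or Twitter's actual trending system; the post data and the trending_topics helper are assumptions invented for this example. It shows why trend hijacking works: a ranker that sees only raw counts cannot distinguish organic enthusiasm from coordinated bot amplification.

```python
from collections import Counter

def trending_topics(posts, top_n=3):
    """Rank hashtags by raw post volume -- a toy stand-in for a
    platform's popularity-based trending algorithm."""
    tags = (tag.lower() for post in posts for tag in post["hashtags"])
    return Counter(tags).most_common(top_n)

# Organic conversation around a genuinely popular topic.
organic = [{"text": "Watch the debate tonight!", "hashtags": ["#Debate2016"]}
           for _ in range(200)]

# Trend hijacking: a botnet attaches a false narrative to the same
# already-trending hashtag, riding its existing visibility.
bots = [{"text": "Totally fabricated claim.", "hashtags": ["#Debate2016"]}
        for _ in range(800)]

print(trending_topics(organic + bots))
# [('#debate2016', 1000)] -- the ranker sees only volume, so it cannot
# tell 200 real users from 800 coordinated accounts pushing disinformation.
```

Real trending systems also weight recency, relevance, and geography, as Prier describes, but the underlying vulnerability is the same: volume stands in for legitimacy, and volume can be manufactured.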

In "A Second Amendment for Cyber: Possession, Prohibition and Personal Liberty for the Information Age," Keith Scott notes that cyberspace is made of relationships and transactions and that each actor is both a consumer and a producer. It is this aspect of virtual spaces that makes them difficult to govern, manage, or regulate. The internet is a non-physical, omnipresent domain of always-on connectivity, and it poses immense, perhaps insolvable, technical and social insecurity. The ability to link devices on a global scale is extraordinarily complex and presents huge risks for individuals and for national security at all levels. The possibility of compromise is always high because anyone can produce unregulated information.

Scott notes that technology does not just facilitate human interaction; it also changes it. Technologies including social media alter individuals' ability to circulate ideas and stories and the way people connect and converse. Technology also determines whom we interact with, what we see, and the structures of power within those interactions. It offers a legitimate platform for voices to be heard and to effect change, and it offers identical power to voices of extremism and bigotry. Scott does not call for regulation of the internet, noting that the internet is a right and not a privilege. Rather, his discussion focuses on actors' decisions on the internet, particularly where it is used to disseminate hate and misinformation. Attempting to ban or regulate the internet and smart devices is not the solution and is completely unworkable. Rather, it is important to understand that technology does not just help make society more functional; it also shapes the direction in which society functions. Hence, there exists a social contract between users, or actors, and the technology. Scott concludes that, as history shows, human beings have managed to control the excesses of human behavior. This means that actors must progressively look for ways of managing the excesses of the internet without damaging its uniqueness as a tool for free information dissemination.

Examining United States democracy from Scott's perspective, actors including Facebook, the government, and users failed to understand the nature of the social contract when technology is involved. Facebook is liable for failing to establish strong firewalls to prevent the platform from being used to disseminate propaganda and influence users. When state actors such as Russia can undermine U.S. democratic freedom, the negative side of the internet comes out. However, exploring the social contract surrounding platforms such as Facebook allows for a more in-depth approach in which not just one actor is liable, but all actors using social media. For example, liking and sharing trending false information allows such information to spread more rapidly. Madrigal notes that because algorithms and artificial intelligence are designed to capture key words, phrases, and terms, it becomes easy for false information to spread much more rapidly. Platforms such as Facebook failed to create algorithms powerful enough, and sufficiently trained, to detect misinformation and false News Feed links (the sketch at the end of this discussion illustrates why shallow detection falls short). While this is technologically challenging, Facebook also failed to invest in the human resources required to continuously monitor information sources as well as advertisement purchases.
As Scott notes, managing and controlling the excesses of social media platforms is key to their positive use. While this is a difficult endeavor, all actors must be held liable as part of the social contract.
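To illustrate why detection is harder than it looks, here is a minimal, hypothetical sketch of a shallow keyword filter of the kind a platform might deploy as a first line of defense. The blocklist phrases and the flag_post helper are assumptions made for this example, not any real Facebook system. The point is that rephrased falsehoods sail straight through, which is why algorithmic keyword matching alone cannot carry the detection burden and why the human review Facebook underinvested in still matters.

```python
def flag_post(text, blocklist):
    """Flag a post if it contains any known false phrase verbatim.
    A deliberately naive first-pass filter."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in blocklist)

# Hypothetical phrases previously fact-checked as false.
blocklist = {"millions of illegal votes", "ballots found in a river"}

print(flag_post("They counted millions of illegal votes!", blocklist))  # True
print(flag_post("Countless unlawful ballots were cast!", blocklist))    # False
# The second post makes the same false claim in different words and
# evades the filter -- shallow matching cannot carry the burden alone.
```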

Humanity rests on an unspoken social contract among all members of society that they will behave in a morally acceptable way. However, the social contract is not always binding and is often not respected by various actors, including states, individuals, and corporations. Human beings are inherently wired to pursue personal needs, and at times that pursuit comes at the expense of others. Facebook, as a platform with over two billion users, is not exempt from these limitations of the social contract. Facebook's management is liable for establishing less-than-adequate mechanisms to curb the dissemination of false information and propaganda. Apart from Facebook, users, who are both consumers and producers, are an important node of the social contract. Many users like and share trending information they find interesting, and in many instances such information is morally and ideologically misleading. The scale of social media platforms means that the source or authors of such information are not always known. Hence, users must practice responsible handling of information. Additionally, given the threat posed by state and non-state actors with ideological goals, the danger to democracy is even bigger. Russia's interference in the 2016 U.S. election demonstrates how social media can be used to erode democracy. Facebook is responsible for failing to establish proper systems to audit information sources and curb disinformation. While the scale of the internet and of Facebook makes this a challenging task, the platform can still make progress in curbing disinformation. At the same time, all users and actors on the platform must demonstrate responsibility toward the unspoken social contract, as the internet is designed to promote freedom of information for all.