7.4 Case Analysis on Information Warfare

Did Facebook engage in Information Warfare? Why or why not? Is Facebook responsible in any way for the outcome of the 2016 election? Why?

Alexis Madrigal’s article, “What Facebook Did to American Democracy,” examines the distinctive change in how we understand and interpret narratives, events, and news within Facebook’s information atmosphere and, most importantly, how that transformation of information was weaponized during the 2016 presidential election. As NATO (2014) states, information warfare is the act of intentionally “controlling one’s own information space” while disrupting an adversary’s flow of information and diminishing trust in their government. Information warfare is fundamentally psychological: it aims to change the narrative of an event in a controlled manner. In the 2016 presidential election, this tactic was used on Facebook to help the Trump campaign win the election, setting a precedent for how information is consumed on social media platforms. In this Case Analysis I will argue that the Ubuntu framework shows us that Facebook did not engage in information warfare, because it took no intentional part in the act; however, Facebook was partly responsible for the election’s outcome because its platform enabled a sprawling sphere of disinformation that plagued American voters.

In his scholarly essay, “Commanding the Trend: Social Media as Information Warfare,” Jarred Prier examines how social media is weaponized and employed for modern information warfare. Prier grounds his discussion in three distinct subjects: social networking, propaganda, and digital information sharing. Social media builds on the traditionally social nature of humans, gathering groups of users around common attributes into what are known as virtual networks. Under the umbrella of social media falls the popularity of trends: trends express what is popular during a given period and connect a wide variety of virtual networks together. Many platforms use an advanced “algorithm to analyze words, phrases, or hashtags to create a list of” (Prier, 2017) trending topics that operate across every virtual network on the platform. As Prier states, the openness of social networking enables the spread of propaganda through trends and user-based algorithms; adding “fake” or malicious bots to the platform can cheaply and effectively boost that misinformation or propaganda. Through these functions, third-party adversaries can take advantage of social media’s openness to insert their own propaganda. Propaganda is important to contextualize because it adds a biased or misleading perspective that controls the narrative of a story, and it needs two basic requirements to function: an existing narrative to build upon and a virtual network of users who already believe in the implicit theme of the platform. As more people believe propaganda or disinformation on a platform, more users are inclined to share that information with their own virtual networks, creating a worm of misinformation that spreads through millions of virtual networks and changes the public narrative through user algorithms, trends, and the networks themselves. In some situations, social media even drives users to seek out news stories from sources beyond the platform. At this severe level of propaganda, the source controls everything in its own domain and uses advanced tactics to further steer the narrative; over time, such a propaganda platform comes to be trusted more than legitimate news outlets within its virtual networks.
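To make the mechanics Prier describes concrete, below is a minimal sketch of a trend-detection routine. Every name and the scoring heuristic here are illustrative assumptions, not any platform’s actual algorithm: it simply counts hashtag frequencies across two time windows and surfaces the fastest-rising tags, which also shows how a handful of bot accounts repeating one hashtag can push it onto the trending list.

```python
from collections import Counter
import re

HASHTAG = re.compile(r"#\w+")

def extract_hashtags(posts):
    """Pull hashtags out of a batch of post texts."""
    tags = []
    for text in posts:
        tags.extend(tag.lower() for tag in HASHTAG.findall(text))
    return tags

def trending(current_posts, previous_posts, top_n=3):
    """Rank hashtags by how much their frequency grew between the
    previous window and the current one -- a toy stand-in for the
    velocity-based scoring Prier attributes to real platforms."""
    now = Counter(extract_hashtags(current_posts))
    before = Counter(extract_hashtags(previous_posts))
    growth = {tag: count - before.get(tag, 0) for tag, count in now.items()}
    return sorted(growth, key=growth.get, reverse=True)[:top_n]

# Organic conversation plus a small bot network repeating one tag:
organic = ["Great game tonight #sports", "New recipe #cooking", "#sports highlights"]
bots = ["#fakestory share this"] * 5   # five identical bot posts
print(trending(organic + bots, previous_posts=["#cooking tips"]))
# ['#fakestory', '#sports', '#cooking'] -- the bot-amplified tag tops the list
```

Because the ranking reacts only to raw frequency growth, the sketch illustrates Prier’s point that cheap bot amplification can hijack a trend without any organic support.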

Prier’s scholarly essay is important to the Case Analysis because it helps determine whether Facebook’s actions constitute an act of information warfare in the 2016 presidential election. Prier argues that social media contributes to information warfare through three distinct elements: social networking, propaganda, and digital information sharing. Facebook is a social media platform for virtual networks and information sharing, but it neither controls nor endorses any political view; in fact, Madrigal’s article states that Facebook has always remained neutral in its services and “they wouldn’t do it intentionally, at least” (Madrigal, 2017). However, the unintentional design and by-products of social media enable third-party media outlets to wage information warfare through the way the platform functions. Therefore, the actions of the third-party adversaries spreading propaganda in support of the Trump campaign would be considered an act of information warfare, while Facebook’s own actions would not: Facebook was only the medium on which the information warfare occurred. Facebook is liable to some degree, but the evolution of technology made social media inevitable, and this information warfare could just as well have occurred on smaller platforms. Using the Ubuntu principles, we can gauge humanness, the level of humanity grounded in moral goodness, through our interdependence and our membership in a community. Essentially, humanity is built around the idea that “a person is a person through other persons” (umuntu ngumuntu ngabantu). The greater the interdependence people have in a community, the stronger their humanity and the fairer their society will be. When relating Ubuntu ethics to the Case Analysis, it is important to set aside our own political opinions about the information warfare of the 2016 presidential election. Regardless of your political side, it is fair to assume that the information warfare occurring on Facebook gave the Trump campaign leverage over the Democratic nominee through propaganda and misinformation. Under Ubuntu ethics, how would it be fair to Democratic officials if their ideologies are not recognized, or their message is altered before it reaches the public? Ubuntu ethics holds that everyone should be recognized in society on an even playing field of information, so that voters can fairly make their decisions of their own accord. Without propaganda or misinformation, Ubuntuism emphasizes, everyone is freer in a society where everyone is heard and recognized. Thus, in this Case Analysis, it is the actions of the third-party adversaries, not Facebook, that were unethical.

In his scholarly essay, “A Second Amendment for Cyber? Possession, Prohibition and Personal Liberty for the Information Age,” Keith Scott sparks debate about the current state of cyberethics and information warfare in modern society. In Scott’s framing, information warfare takes on a more neutral character: it is a double-edged sword. As Scott states, “If everyone is a media hub, then we must recognize that many voices will represent opinions which we will find objectionable, offensive, and unacceptable; the same technology that allows demonstrators of Tahrir Square to voice their grievances” (Scott, 2017). On one edge, mass media hubs provide a platform against injustice and shed light on civil issues in society; in this vein of information freedom, techniques such as trend-hijacking show the benefits of information warfare. On the other edge, mass media blurs where information originates and whether it is accurate, and adversaries, trolls, and state actors take advantage of the openness of social media platforms to inject misinformation or propaganda. Once they have an audience, they can control the narrative and shift users’ political viewpoints. Scott’s striking statement that “the mobile phone is, in many respects, the AK-47 of the 21st Century” (Scott, 2017) expresses how unregulated information sharing is and how powerful users who spread misinformation can be. Given this power, Scott argues that information technology should require some sort of “driving test,” a proficiency exam proving an individual can demonstrate ethical online behavior. Just as firearms and cars can endanger both their users and the people around them, social media behavior can harm others digitally. Scott hypothesizes that such IT regulation would need two minimum elements: a requirement to monitor or block internet traffic through ISPs, and a requirement that all internet users demonstrate a minimum standard of ethical behavior through a tracked registration number.
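Purely as an illustration of what Scott’s two minimum elements could look like in practice, the toy gateway below checks a sender’s tracked registration number and ethical standing before forwarding a post. All names, fields, and thresholds are hypothetical assumptions for this sketch, not anything Scott specifies.

```python
from dataclasses import dataclass

@dataclass
class RegisteredUser:
    registration_number: str   # Scott's tracked registration number
    ethics_score: float        # standing from a hypothetical proficiency exam

MINIMUM_ETHICS_SCORE = 0.7     # arbitrary threshold for this sketch

def isp_gateway(user: RegisteredUser, post: str, registry: set) -> bool:
    """Forward the post only if the sender is registered and meets the
    minimum ethical-behavior standard; otherwise block and log it."""
    if user.registration_number not in registry:
        print(f"BLOCKED: unregistered sender ({post!r})")
        return False
    if user.ethics_score < MINIMUM_ETHICS_SCORE:
        print(f"BLOCKED: {user.registration_number} below ethics minimum")
        return False
    print(f"FORWARDED: {user.registration_number} -> {post!r}")
    return True

registry = {"REG-001", "REG-002"}
isp_gateway(RegisteredUser("REG-001", 0.9), "Election day update", registry)
isp_gateway(RegisteredUser("REG-002", 0.4), "#fakestory", registry)
```

The design simply makes Scott’s proposal tangible: the ISP sits in the traffic path (his first element) and consults a registration record tied to demonstrated behavior (his second).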

Scott’s interpretation of information warfare and cyberethics is important to the Case Analysis because it helps determine whether Facebook was responsible for the outcome of the 2016 presidential election. Scott’s main argument concerns cyberethics and how unethical behavior can contribute to information warfare, and he contends that the modern IT structure should implement regulations on online behavior to combat both. It is important to note that the 2016 presidential election suffered heavily from disinformation, misinformation, and “fake news” that supported the Trump campaign; Scott’s proposed regulations might have helped keep third-party adversaries from disrupting the election, though it is difficult to say for certain. In the Case Analysis, Facebook represents the platform on which the information warfare occurred. While its actions did not directly contribute to information warfare, the structure and function of Facebook enabled the activity, which holds the company partly responsible for not enacting mechanisms to combat misinformation. In line with Scott’s argument, Facebook should have licensed, regulated, or fact-checked content to keep misinformation from spreading on its platform and affecting the presidential election. Indeed, because of Facebook’s lack of action, the 2016 presidential election was affected to the benefit of the Trump campaign. Again, the Ubuntu principle measures the humanity, freedom, and fairness of a society by the interdependence and recognition among its community members: the more recognition, the stronger the interdependence, and the fairer the society. Setting aside our own political viewpoints, it is important to recognize that the information warfare that occurred in favor of the Trump campaign negatively affected other political community members in the election, creating an uneven field of information that failed to recognize other participants. How is it fair to the other participants in the presidential election if their ideologies go unrecognized or are altered before the community of voters? Therefore, Scott’s interpretation of cyberethics and the Ubuntu principle help determine that Facebook is partly responsible for the outcome of the 2016 election and that the information warfare at the center of this Case Analysis was unethical.

There is no doubt that the information warfare that occurred on Facebook helped a wide variety of adversaries spread misinformation in favor of the Trump campaign during the 2016 election. In his essay, Jarred Prier shows that the current structure of social media leaves platforms vulnerable to misinformation and propaganda campaigns. The actions Prier describes, however, do not directly implicate Facebook in the information warfare; other third-party adversaries are to blame for that contribution. In his essay, Keith Scott debates current information technology regulation and raises concerns about cyberethics. By comparing the mobile phone to the AK-47 of the 21st century, Scott argues for regulating online behavior to combat information warfare and unethical conduct. While Facebook did not directly contribute to the 2016 information warfare, we can extend Scott’s argument to Facebook: the lack of regulations or policies on the platform enabled information warfare that assisted the Trump campaign. Therefore, Facebook is partly responsible for the outcome of the 2016 election. Using the principles of Ubuntu theory, we can judge how fair, humane, and free a society or community is by the interdependence and recognition shown between its members. In the controversy over propaganda assisting the Trump campaign, other political figures in the election, such as Hillary Clinton, went unfairly unrecognized, and their message was drowned out in Facebook’s information atmosphere, limiting the humanity, fairness, and freedom afforded to the community of American voters.

References

NATO – Defense Education Enhancement Programme (DEEP). (2014). Media – (dis)information – security. https://www.nato.int/nato_static_fl2014/assets/pdf/2020/5/pdf/2005-deepportal4-information-warfare.pdf