Facebook did engage in information warfare in the 2016 election. Cases of ‘fake news’ and large volumes of misinformation circulated on the platform and were used to sway the public in one direction or another. This was enabled by Facebook’s algorithm-based content system, which showed people more of whatever they had engaged with before. That system amplified disinformation on the platform and entrenched biased points of view among supporters of Clinton or Trump alike. Facebook also facilitated ads that were false or that spread misinformation, and during the election the platform was exploited by Russian operatives, who used Facebook’s content distribution to divide the American public and influence the outcome of the election. In this Case Analysis I will argue that contractarianism shows us that Facebook did engage in information warfare because the company was responsible for what happened on its platform, and further that it was partly responsible for the election outcome because its algorithms were not transparent to its users.
Firstly, Prier discusses the age of mass communication and how it has changed the way information is distributed and received. This change creates more opportunity for mass disinformation and propaganda to exist on a public platform like Facebook. Prier states that for propaganda to flourish, it has to build on an already existing story or narrative. This is easily accomplished on social media, where anybody can post their ideas openly for the world to see, and where claims made by strangers gain a degree of borrowed credibility simply because they appear on a trusted platform. Even if a story is riddled with disinformation, enough shares and likes can make it read as fact; from there it is shared with thousands more people and becomes widely accepted as truth even when it is not. Furthermore, one of the key points Prier makes about propaganda is that the information has to resonate with the person being targeted. When it does, the target’s existing bias greatly increases the chance they will believe information online without fact-checking the source; without that resonance, misinformation is far less effective. This is why Facebook’s targeted ad system is unethical: by matching content to users’ existing leanings, it could unintentionally spread misinformation to exactly the people most primed to believe it.
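To make that feedback loop concrete, here is a minimal sketch of an engagement-weighted feed ranker. Everything in it (the field names, the weights, the scoring rule) is a hypothetical simplification invented for illustration; Facebook’s actual News Feed ranking is proprietary and far more complex.

```python
# Minimal sketch of an engagement-weighted feed ranker (hypothetical).
# All names and weights are illustrative assumptions, not Facebook's
# actual (proprietary) News Feed algorithm.
from dataclasses import dataclass, field


@dataclass
class Post:
    topic: str
    shares: int
    likes: int


@dataclass
class User:
    # Running counts of topics this user has engaged with before.
    topic_affinity: dict[str, int] = field(default_factory=dict)


def score(user: User, post: Post) -> float:
    """Score a post by prior affinity times raw engagement.

    Nothing here checks accuracy: a heavily shared false story that
    matches the user's past engagement outranks a true but less
    popular correction.
    """
    affinity = user.topic_affinity.get(post.topic, 0)
    popularity = 2 * post.shares + post.likes
    return affinity * popularity


def build_feed(user: User, candidates: list[Post]) -> list[Post]:
    # Users only ever see more of what they already engaged with.
    return sorted(candidates, key=lambda p: score(user, p), reverse=True)


if __name__ == "__main__":
    user = User(topic_affinity={"candidate_scandal": 12})
    posts = [
        Post("candidate_scandal", shares=900, likes=3000),  # viral, false
        Post("fact_check", shares=40, likes=150),           # true, buried
    ]
    for post in build_feed(user, posts):
        print(post.topic)
```

Because the score multiplies prior affinity by raw popularity and never consults accuracy, a widely shared false story that matches a user’s existing views will outrank a less popular correction, which is exactly the dynamic Prier describes.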
This presents a major problem for social media platforms like Facebook, where people go to get information on current events. Such a case occurred in 2016, when millions of Americans were exposed to misinformation on Facebook in the months before the presidential election. The article “What Facebook Did to American Democracy” reports that when Facebook ads are present, those ads can sway people to vote in a certain direction by nearly 20 percent. That finding alone shows why it is important that Facebook be aware of what happens on its platform. Through its targeted ad system, Facebook played a role in the outcome of the election: the wave of disinformation, and the ease of getting information, true or not, from a single source, disrupted the electoral process. The article also notes that not even Facebook knew everything that was going on surrounding its platform.
Facebook had a responsibility to the American people to present truthful and unbiased information on its platform. It failed to do so, and it even allowed Russian operatives to influence the election without knowing it. Facebook is partly responsible for the outcome of the election, even though it did not commit the direct act of spreading misinformation; it should have been aware of what was happening on its platform. Furthermore, Facebook should have been transparent with its audience and explained how its targeted ad system worked. As the articles state, Facebook owed it to its users to disclose the political ads running in the lead-up to the election and how those ads were being distributed to different types of people.
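As a rough illustration of what that transparency could look like, here is a hypothetical political-ad disclosure record. Every field name here is an assumption made up for this sketch; it does not describe any real Facebook API or data format.

```python
# Hypothetical political-ad disclosure record -- a sketch of the kind
# of transparency argued for above, not any real Facebook data format.
import json

disclosure = {
    "ad_id": "example-001",
    "paid_for_by": "Example Advocacy Group",  # who funded the ad
    "is_political": True,
    "targeting": {                            # how the audience was chosen
        "age_range": [25, 54],
        "regions": ["Wisconsin", "Michigan"],
        "inferred_interests": ["border security"],
    },
    "impressions": 120000,                    # how widely it was shown
}

print(json.dumps(disclosure, indent=2))
```

Publishing even a record like this for every political ad would have let users and researchers see which groups were shown which messages, the kind of mutual understanding between platform and user that contractarianism calls for.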
Secondly, Scott compares Open Source Warfare (OSW) to some of Clausewitz’s ideas. Scott then goes on to compare OSW to the “fourth generation of warfare” and argues that this idea from Echevarria can serve as a baseline for defining the complexities of the cyber domain and how it will shape our military and political future. The fourth generation of warfare refers to the tactical impact that non-state actors can have, politically and militarily, in the cyber domain. I think comparing OSW to fourth-generation warfare is apt, because we have never before seen non-state actors exert the kind of influence in a warfare environment that they now exert in the cyber realm. The situation is made more complex by the fact that non-state actors can attack without attribution, which makes cyber warfare an even more attractive means for them to pursue their agendas.
Scott then brings OSW into perspective by showing how it can affect the political sphere within the United States. He describes each of the three parties in the political sphere as ‘weaponizing’ social media to push their agendas, and notes that while cable news is in decline, our society uses social media more than ever.
The concept of OSW maps well onto the events on Facebook leading up to the 2016 presidential election. There were many examples of OSW during the campaigns of each political party: each campaign was made up of many individuals who used readily available technology and information to push their agenda to the ordinary Facebook user. Some of the misinformation being shared was especially effective because the individuals pushing it out shared the same ideals as their audience.
This form of Open Source Warfare played out through social media. As mentioned before, it swayed Facebook users to vote a certain way. It occurred on Facebook’s platform, so Facebook should be held accountable for the misinformation that was spread, in part through its algorithm-based content system. With the election approaching, Facebook should have taken the initiative and watched for misinformation favoring one party or another. Facebook claims to be a neutral platform, so one would expect it not to show political content only to certain people. As noted earlier, Facebook claimed not to know about the Russian interference in the election. This kind of failure matters because users trust the company to be honest about what is shown to them. Facebook did not operate with its users in mind: there was no mutual agreement between users and the platform about being targeted with ads aimed at specific political affiliations. If Facebook expects to be a host of current events, I would hope it will be more transparent in the future about what it shows its users.
In conclusion, I think that Facebook did engage in information warfare. Facebook allowed misinformation on its platform during a sensitive period when it is widely known that misinformation spreads. Facebook owes its users a platform where they know they are being treated fairly and with consent. That was not the case here: Facebook was unaware of some of the IW taking place on its platform, and its algorithm-based content system caused disruption. While Facebook should not be held completely responsible for the outcome, it still played a part in allowing users to be misled by misinformation. Overall, a case could be made that Facebook did not engage in information warfare because users bear the responsibility of discerning what they accept as truth; this is an understandable counterargument that runs directly against my position.