Facebook and Information Warfare

In the article "What Facebook Did to American Democracy," Alexis Madrigal outlines how there have been studies of Facebook's impact on elections since 2012. Madrigal goes on to highlight how Facebook used heavy personalization to push specific advertisements and content to targeted users. When users engaged with posts they agreed with, they received more of those posts and fewer of the posts they did not engage with. This developed a social structure in which individuals were rarely shown posts from opposing viewpoints and were instead given posts they were likely to agree with. This structure heavily encouraged a concentration of viewpoints and widened the ideological schism within American politics. New additions to the Facebook site, such as Facebook videos, led to more users interacting with those features, and the increased use of video led media companies to apply considerably lower standards when scrutinizing reports so they could release videos in quantity. Hyper-partisan media sites continually siphoned partisan Facebook users toward their hyper-partisan views; Madrigal describes Breitbart as the largest example of this. This exposure continued the schism and influenced users to become increasingly radical. Hyper-partisanship within Facebook increased the quantity of misinformation on the site, funneling users into these partisan sects of the platform. In this case analysis I will argue that consequentialism shows us that Facebook partook in information warfare because the consequences of Facebook's desire to grow led to the manipulation of users, and furthermore this manipulation makes Facebook partially responsible for the outcome of the 2016 election.

In the reading "Commanding the Trend: Social Media as Information Warfare," Jarred Prier presents the central concept that the trending-topics feature of social media sites can be exploited to manipulate viewpoints. This means that organizations can artificially create trending topics on social media in order to present a specific viewpoint. In Facebook's case, the use of social media trends to push specific viewpoints had a considerable impact on the results of the election and on the viewpoints of the people using the site. Within the article, Madrigal highlights misinformation campaigns that used this trend feature within Facebook to draw engagement from as many people as possible, increasing the number of people who saw their posts. This perpetuated a loop in which people continuously saw misinformation in their feeds, along with the hyper-partisan sites that push this misinformation. Continuous exposure to these hyper-partisan views encouraged users to adopt similar viewpoints, furthering the schism between ideologies and increasing the polarization of American politics.

This highlights Facebook's engagement in information warfare. Facebook supplemented the growth of this misinformation through its algorithm: the trending algorithm became the means by which American voters were fed misinformation and had their viewpoints manipulated. Facebook's lack of action to limit misinformation or hyper-partisanship on its site permitted this manipulation, and its personalized advertisements and feeds reinforced the cycle. Facebook was therefore enacting information warfare on American voters. These systems allowed hyper-partisans to influence the 2016 presidential election, and because Facebook provided the means these hyper-partisans used, Facebook is partially responsible for the influence on voters and therefore for the outcome of the election.

Using the consequentialism tool, we can see that Facebook had the opportunity to prevent the unethical use of its trending algorithm and its encouragement of hyper-partisanship from influencing the 2016 election. To abide by the consequentialism tool, Facebook should have put in place a system that verified the factual accuracy of trending posts. This would have allowed viewers to be better informed of a post's accuracy and therefore come to a more educated decision when voting. Users making educated political decisions is a good consequence for a democratic nation and mitigates the bad consequences of misinformation. Additionally, Facebook should have attached a rating of integrity and partisanship to posts published by media sites. This would have given Facebook users the ability to understand the bias within a post and whether its information was manipulated to encourage a particular view. Presenting a bias checker mitigates the bad consequence of hyper-partisan groups looking to manipulate American voters, and encourages the good consequences of near-neutral media sites looking to inform voters. Finally, Facebook should have implemented a feature within its algorithm that let users see the context of posts. This would allow Facebook users to recognize when a post was taken out of context and decide for themselves whether the context changes its meaning. This encourages the good consequence of users taking context into consideration in their decisions, and mitigates the bad consequence of context being manipulated to push a particular viewpoint. However, none of these actions was taken before the 2016 presidential election, because Facebook wanted good consequences only for its company, not good consequences for the world.
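As a purely illustrative sketch of what such labeling could look like (the fields, thresholds, and labels below are my own assumptions, not anything Facebook actually implemented), a post could carry an accuracy score from fact-checkers and a partisanship score, with both rendered alongside the content:

```python
from dataclasses import dataclass

@dataclass
class RatedPost:
    text: str
    accuracy: float      # 0.0 (false) to 1.0 (verified by fact-checkers)
    partisanship: float  # -1.0 (far left) to +1.0 (far right)

def render_with_labels(post: RatedPost) -> str:
    # Attach the ratings to the post so the reader can judge it in context.
    if post.accuracy >= 0.8:
        accuracy_label = "verified"
    elif post.accuracy >= 0.4:
        accuracy_label = "disputed"
    else:
        accuracy_label = "false"
    lean = abs(post.partisanship)
    bias_label = "neutral" if lean < 0.3 else "partisan" if lean < 0.7 else "hyper-partisan"
    return f"[{accuracy_label} | {bias_label}] {post.text}"

# A hypothetical hyper-partisan post would be displayed with its warnings.
print(render_with_labels(RatedPost("Candidate X secretly did Y!", accuracy=0.2, partisanship=0.9)))
```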

In the reading "A Second Amendment for Cyber? – Possession, Prohibition and Personal Liberty for the Information Age," Scott establishes the central concept that information, and the spread of information, is difficult for organizations to subvert. Applying this to the case, Facebook was the tool used to spread misinformation regarding the 2016 presidential election and its campaigns. Once the site was used this way, it was difficult for both Facebook and the United States government to subvert the misinformation spreading through it. Facebook's primary goal was to encourage as much engagement as possible on its platform; that is why its algorithm was designed to reinforce and push the posts and media sites that received the most engagement. The most-engaged sites were hyper-partisan due to their polarizing nature.

This further demonstrates Facebook's engagement in information warfare. Facebook refused to limit the use of its site as a tool for misinformation. Facebook's business practices pushed the company toward its primary goal of competing with other social media sites, and engagement is the primary indicator of a social media site's influence. Facebook's creation of an algorithm that rewards misinformation and hyper-partisanship shows that it was involved in the manipulation of voters in the 2016 presidential election, and therefore influenced its outcome.

Using the consequentialism tool, Facebook had multiple ways to prevent its involvement in information warfare and its influence on the 2016 presidential election. Facebook had the opportunity to artificially deflate trends within its site: it should have deflated hyper-partisan posts and inflated neutral posts. This would have produced good consequences by presenting more Facebook users with legitimate posts, and mitigated bad consequences by limiting their exposure to hyper-partisan posts. Another action Facebook could have taken was altering its algorithm to reinforce positive engagement. Hyper-partisans took advantage of the algorithm's encouragement of any engagement, which propelled any post with massive engagement into the trends, even if a majority of that engagement was negative. Weighting positive engagement would limit exposure to polarizing viewpoints, mitigating the negative consequences associated with radicalized viewpoints, and would allow posts with positive engagement to trend significantly more than negative ones, promoting good consequences such as factual accuracy. Finally, Facebook allowed foreign nations to purchase politically targeted ads on its site, which let those nations influence the 2016 presidential election by artificially flooding the site with misinformation. Facebook should have prevented foreign nations from purchasing politically targeted ads. This would have mitigated the negative consequences of foreign involvement in domestic politics and encouraged the positive consequence of voters forming their perspectives free of foreign influence. Each of these potential actions would have created a significantly more ethical outcome according to the consequentialism tool: Facebook would not have engaged in information warfare, and it would have had a limited effect on the political outcome of the 2016 presidential election.
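To make the proposed algorithm change concrete, here is a minimal sketch of the difference between ranking by raw engagement and ranking by positively weighted engagement. The scoring functions, weights, and numbers are entirely hypothetical illustrations, not Facebook's actual trending code:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    positive: int  # e.g., likes and approving shares
    negative: int  # e.g., angry reactions, reports, disputes

def raw_engagement_score(post: Post) -> int:
    # The criticized approach: all engagement counts toward trending,
    # so outrage-driven posts trend as easily as legitimate ones.
    return post.positive + post.negative

def weighted_engagement_score(post: Post) -> float:
    # The proposed alternative: negative engagement lowers the score,
    # deflating posts that trend on outrage alone.
    return post.positive - 2.0 * post.negative

posts = [
    Post("Hyper-partisan outrage bait", positive=1_000, negative=9_000),
    Post("Neutral news report", positive=4_000, negative=500),
]

# Under raw scoring the outrage post trends first; under weighted
# scoring the neutral report does.
for score in (raw_engagement_score, weighted_engagement_score):
    top = max(posts, key=score)
    print(score.__name__, "->", top.text)
```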

In conclusion, Madrigal's article showcases the ethical mistakes Facebook made in its attempt to compete with other social media sites. Facebook engaged in information warfare against American voters and the United States election system, and it influenced the 2016 presidential election by acting as a tool for hyper-partisans and foreign nations to spread misinformation. Its trend algorithm greatly amplified Facebook users' exposure to misinformation created by these malicious actors. This directly conflicts with the idea of consequentialism: Facebook had no regard for the larger scope of the consequences of its algorithm and its thirst for engagement, and this directly led to the 2016 presidential election being influenced. Facebook had the opportunity to adjust its algorithm at any point to limit the spread of misinformation; instead, it continued to allow misinformation to be pumped through its platform, which was a significant contributor to the recent polarization of American politics. Some may argue that Facebook was working in the best interest of its company by maximizing engagement on its site. I agree that this maximized engagement, but I believe that the company's sacrifice of integrity has had negative long-term implications for the trust of Facebook users.

References:

Madrigal, A. C. (2017, October). What Facebook Did to American Democracy. The Atlantic. https://www.theatlantic.com/technology/archive/2017/10/what-facebook-did/542502/