{"id":308,"date":"2024-11-18T01:53:16","date_gmt":"2024-11-18T01:53:16","guid":{"rendered":"https:\/\/sites.wp.odu.edu\/hannahklein\/?p=308"},"modified":"2024-11-18T01:53:16","modified_gmt":"2024-11-18T01:53:16","slug":"case-analysis-on-information-warfare","status":"publish","type":"post","link":"https:\/\/sites.wp.odu.edu\/hannahklein\/2024\/11\/18\/case-analysis-on-information-warfare\/","title":{"rendered":"Case Analysis on Information Warfare"},"content":{"rendered":"\n<p>In Alexis C. Madrigal\u2019s article, \u201cWhat Facebook Did to American Democracy,\u201d the focus<br>is on Facebook&#8217;s role in political campaigns, particularly the 2016 presidential election.<br>The article examines how Facebook employs algorithms to deliver targeted ads to<br>individual users based on their interactions\u2014such as likes, comments, and shares\u2014on the<br>platform. Madrigal argues that the use of this social media tool contributed to skewing the<br>election in favor of one party over the other. Additionally, the article explores how Russia<br>allegedly \u201cinfiltrated\u201d Facebook and other social media sites to conduct disinformation<br>campaigns during the election. In this Case Analysis, I will contend that, from a contractarian<br>perspective, Facebook did not participate in information warfare as it remained neutral<br>towards both parties. Furthermore, Facebook should not be held accountable for the election\u2019s<br>outcome since it is not responsible for the actions of foreign adversaries who attempted to<br>exploit its platform. This analysis will argue that Facebook&#8217;s role was limited to providing a<br>platform rather than influencing or controlling the content disseminated on it.<br>Using Keith Scott\u2019s paper, \u201cA Second Amendment for Cyber? \u2013 Possession, Prohibition,<br>and Personal Liberty for the Information Age.\u201d I will explain how Facebook should not be held<br>accountable. This paper discusses the shifting dynamics of cyber connectivity in an era where<br>nearly everyone has access to the internet with minimal restrictions. Scott suggests<br>implementing standardized testing or certification to ensure safe internet use, but such<br>measures could undermine personal freedoms, particularly given that internet access is often a<br>paid service.<br>Scott highlights a significant concern: the proliferation of software that allows for the<br>manipulation of audio and video, making it possible to create fabricated content that<br>appears authentic. He notes, \u201cFrom the standpoint of Information Warfare and \u2018fake<br>news,\u2019 the open distribution of software which allows the editing of sound and video to in<br>effect make anyone say and do anything to and with anyone (Suwajanakorn et al.,<br>2017; Farokhmanesh, 2018) means that politics, bullying, and cyberstalking are going to<br>become even darker and crueler than the present.\u201d This capability to manipulate media<br>content is not unique to Facebook; it affects all digital platforms and reflects broader<br>media practices.<br>The media has long been adept at editing and sensationalizing stories to capture<br>attention, irrespective of the platform used for dissemination. The problem of content<br>manipulation and misinformation extends beyond any single platform and is deeply<br>rooted in media practices as a whole. 
To explain why Facebook should not be held accountable, I will draw on Keith Scott's paper, "A Second Amendment for Cyber? – Possession, Prohibition, and Personal Liberty for the Information Age." The paper discusses the shifting dynamics of cyber connectivity in an era when nearly everyone has internet access with minimal restrictions. Scott raises the possibility of standardized testing or certification to ensure safe internet use, but such measures could undermine personal freedoms, particularly given that internet access is often a paid service.

Scott highlights a significant concern: the proliferation of software that makes it possible to manipulate audio and video and to create fabricated content that appears authentic. He notes, "From the standpoint of Information Warfare and 'fake news,' the open distribution of software which allows the editing of sound and video to in effect make anyone say and do anything to and with anyone (Suwajanakorn et al., 2017; Farokhmanesh, 2018) means that politics, bullying, and cyberstalking are going to become even darker and crueler than the present." This capability to manipulate media content is not unique to Facebook; it affects all digital platforms and reflects broader media practices.

The media has long been adept at editing and sensationalizing stories to capture attention, whatever the platform used for dissemination. The problem of content manipulation and misinformation extends beyond any single platform and is deeply rooted in media practices as a whole. Thus, while Facebook provides a space for content sharing, it is not fundamentally responsible for the inaccuracies or manipulations that occur across the broader media landscape. The challenge lies not in the platform itself but in the pervasive nature of media manipulation and the responsibility of users to critically assess the information they encounter.

Media platforms should embrace a contractarian approach to content management, applying reasonable standards. While it is appropriate to censor explicit content to protect younger viewers, the broader principle is to assess whether the overall distribution of benefits and risks is equitable. This is the foundation of Facebook's content policies and a reasonable framework to follow. Ultimately, it is up to individual users to scrutinize and evaluate the content they encounter, rather than having Facebook decide for everyone.

Scott also observes that in a democratic society, citizens cannot be forced to give up their freedoms, even risky ones; they can only be persuaded, through clear and compelling messages, to make sacrifices for the greater good. This judgment is subjective, because what one person perceives as dangerous might not be seen the same way by others. That subjectivity underscores why Facebook should not be held directly accountable for the outcomes of the 2016 election. Even if foreign entities like Russia attempted to influence content, it is up to users to critically evaluate the validity of the information they see. Facebook's role is not to filter or judge content for its users but to provide a platform where individuals can make their own informed decisions.

In analyzing why Facebook did not directly influence the results of the 2016 presidential election, I will draw on Lt. Col. Jarred Prier's paper, "Commanding the Trend: Social Media as Information Warfare." Prier's research examines how U.S. adversaries, particularly Russian operatives, have manipulated social media to spread disinformation. He provides detailed accounts of Russian activities during the election, including hacking, espionage, and strategic manipulation, all of which used social media to propagate "fake news" and misleading information with the intent of influencing electoral outcomes. Despite the potential effects of these disinformation campaigns, Facebook itself did not take actions designed to sway the election toward either political side. Its primary function is to serve as a platform that enables users to connect, share, and promote their businesses; it is designed to facilitate communication and interaction, not to scrutinize or validate the accuracy of every piece of content shared on it.

Facebook's core responsibility is ensuring that the platform operates effectively and remains available for diverse uses. Its role is not to act as a gatekeeper of content or to oversee the truthfulness of information disseminated by users. While it is important for Facebook to address misuse of its platform, it is ultimately up to users to critically evaluate the content they encounter. The broader issue of misinformation involves systemic media practices and user responsibility rather than being solely attributable to the platform itself.
According to a Facebook spokesperson, "We as a company are neutral – we have not and will not use our products in a way that attempts to influence how people vote." This assertion underscores Facebook's stated commitment to neutrality: the platform should offer an environment where users can share and engage with content without undue influence, and it should not be manipulated to unfairly benefit any political party.

In contractarian terms, Facebook operates under a principle akin to a "veil of ignorance": the company's internal views on political matters are kept hidden, preventing any potential bias from affecting users. This approach ensures that, whatever biases or operational tendencies exist within the platform, Facebook's fundamental role remains that of a neutral facilitator of interaction and content sharing.

This neutrality is pivotal to Facebook's operational philosophy. While the company does implement measures against misinformation and harmful content, it is not equipped to oversee the full spectrum of political influence that may occur through its platform. Its responsibility is to provide a space where users can freely exchange ideas and information; users are tasked with critically assessing the content they encounter rather than relying on Facebook to moderate or validate every piece of information.

Moreover, Facebook's role extends beyond mere content hosting; it involves creating an environment where diverse perspectives can coexist. The challenge of misinformation and political manipulation is not unique to Facebook but is a broader issue within the media landscape. By upholding its commitment to neutrality, Facebook aims to support an open forum for democratic discourse while recognizing that the responsibility for discerning truth and forming political opinions ultimately lies with users themselves.

Any website is vulnerable to infiltration by external entities; that is a characteristic of our digital era. Prier describes several strategies used to disseminate disinformation on social media, such as trend distribution, trend hijacking, and trend creation, which allow individuals with malicious intent to saturate a platform with targeted, misleading content. Although Facebook makes efforts to limit the spread of such disinformation, it is a technology platform that cannot entirely prevent all forms of digital manipulation or attack; the complexity and scale of modern digital ecosystems make it exceedingly difficult to eliminate every instance of malicious activity. Eric Hoffer's observation that "propaganda on its own cannot force its way into unwilling minds, neither can it inculcate something wholly new" underscores the limitations of any platform in controlling the content it hosts. While Facebook can implement measures to address disinformation, it ultimately remains a tool for its users, and the responsibility for discerning and verifying the accuracy of information lies with individuals. Facebook's primary duty is to maintain an operational platform, not to act as a content censor or arbiter of truth. As such, users must navigate and critically evaluate the content they encounter, understanding that the platform's role is to provide access rather than to enforce content accuracy.
In summary, while there is evidence that outside actors sought to influence the election by promoting one side more aggressively, similar tactics could be employed by any political faction. Although these manipulative campaigns took place on Facebook, it is not the platform's direct responsibility to verify the authenticity of every ad or piece of content. Facebook may well develop more sophisticated algorithms to flag misleading information in the future, but such technologies could remain inherently subjective and unreliable, as the short sketch below suggests.
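To see why automated flagging is hard to get right, consider a deliberately naive example. This is a hypothetical keyword-based flagger written for illustration only; the phrase list and matching rule are assumptions, not any platform's real classifier.

```python
# Hypothetical keyword-based misinformation flagger, for illustration only.
# The phrase list and matching rule are assumptions, not a real classifier.
SUSPECT_PHRASES = {"rigged", "hoax", "they don't want you to know"}

def flag_post(text: str) -> bool:
    """Flag a post if it contains any phrase on the suspect list."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in SUSPECT_PHRASES)

# A fabricated claim slips through because it avoids the listed phrases...
print(flag_post("Candidate X secretly bought a million votes"))         # False
# ...while a factual report quoting the claim gets flagged.
print(flag_post("Officials debunked claims that the vote was rigged"))  # True
```

The false negative and false positive here illustrate exactly the subjectivity described above: any fixed rule encodes someone's judgment about what counts as misleading, and determined actors can route around it.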
Furthermore, it is crucial to recognize that social media platforms like Facebook are designed to facilitate free speech and expression rather than to act as arbiters of truth. The primary responsibility lies with individuals to approach content with critical thinking and not to rely solely on social media for guidance or validation. We must actively engage in discerning the credibility of information and make informed decisions based on our own judgment.

Ultimately, while platforms can and should take steps to mitigate misinformation, the onus of evaluating content and forming political opinions rests with us as users. Social media serves as a tool for communication and expression, but it is our duty to maintain intellectual autonomy and not let these platforms unduly shape our perceptions or decisions.