{"id":234,"date":"2023-04-23T15:37:06","date_gmt":"2023-04-23T15:37:06","guid":{"rendered":"https:\/\/sites.wp.odu.edu\/ksers001\/?page_id=234"},"modified":"2023-04-23T15:37:10","modified_gmt":"2023-04-23T15:37:10","slug":"cases-analysis-on-data","status":"publish","type":"page","link":"https:\/\/sites.wp.odu.edu\/ksers001\/sample-page\/law-and-ethics-phil-355e\/cases-analysis-on-data\/","title":{"rendered":"Cases Analysis on Data"},"content":{"rendered":"\n<p><br>The industry around data grosses billions of dollars annually. Data on people around the world is<br>used in numerous ways such as research, marketing, surveillance and more. In 2018 the General Data<br>Protection Regulation or GDPR for short was put into effect across the European Nations. This was done<br>to help protect individuals\u2019 privacy and protect their personal data from misuse and improper handling.<br>This gave EU citizens more control over data such as names, pictures, biometric data as well as other<br>data that can be used to identify an individual. Till then users did not have much control over how their<br>data was used once it became public. The new regulations also set forth standards on how controllers<br>and processors of data were responsible for any misuse or poor handling. Controllers and processors of<br>data are help to fairness and transparency standards. This was a monumental achievement for the<br>citizens of the EU, but over seas in the United States such federal regulations do not exist or are not as<br>protective. 
In this Case Analysis I will argue that contractarianism shows us that the United States should<br>follow Europe\u2019s lead because users in the United States may be unaware of how their data is being used<br>or that it has been compromised.<br>In a research paper written by Michael Zimmer titled \u201cBut the Data is Already Public\u201d (Zimmer),<br>we can see the ethical conflicts that arise when data is either used unintentionally or used to partially re-identify<br>users of a data set. The paper explains how datasets collected from the Facebook accounts of students at a<br>specific college were publicly released. The team that released these datasets made attempts to conceal<br>the identities of the users to protect privacy. However, despite these efforts, the users were re-identified<br>from the anonymized data provided. As part of its efforts, the research team that released the<br>data created a TOU, or \u201cTerms of Use,\u201d agreement that specifically prohibited the use of this data to<br>re-identify the people. Specific details of these datasets, such as a list of college majors, were used to<\/p>\n\n\n\n<p>identify the specific college these records came from. Although PII was not part of these datasets, the<br>students were still partially identified. The research team removed the data from public view shortly<br>after re-identification took place. The team did not face any repercussions other than scrutiny. Under<br>the GDPR, this would be considered a breach, and the research team that released the data would have<br>been held responsible for its misuse. 
Under the GDPR, it is the responsibility of the controllers and<br>processors to ensure that data is used properly and that the users who provided the data fully<br>understand and are properly notified how the data will be used, with no vagueness.<br>\u201cIn July of 2021, European regulators in Luxembourg fined Amazon Europe a whopping $877m<br>fine for data breaches and failing to comply with general data processing principles under GDPR.\u201d (Clark)<br>Fines like the one Amazon received in July of 2021 should give controllers and processors of data a<br>larger incentive to better protect the privacy and security of user data. Not all companies face this level<br>of repercussions for data that is improperly handled, but if an incident occurs, the companies<br>responsible will be held accountable for their actions or lack thereof. This applies not only to data in<br>Europe but to any business that handles the data of European citizens, even outside its borders. \u201cThis means the<br>reach of the legislation extends further than the borders of Europe itself, as international organizations<br>based outside the region but with activity on &#8216;European soil&#8217; will still need to comply.\u201d (Palmer)<br>Using the ethics of contractarianism, specifically the \u201cveil of ignorance,\u201d a concept proposed<br>by John Rawls, can show us how the United States should adopt these same practices because it is only<br>fair for all of society, regardless of where you stand. Would the research team have made a better \u201cbest<br>effort\u201d at concealing identifiable information in the datasets? Perhaps the data would never have been<br>made public in the first place. Not knowing what side you\u2019re on, the side of the controllers and<br>processors or the side of the end user, could help us adopt rules that are fair for everyone. 
It\u2019s only right<br>that users fully understand what data will be collected and how it will be used and, if a breach occurs, that<br>they are notified immediately so they can take proper action. This also falls under \u201cjustice as fairness\u201d (Rawls)<\/p>\n\n\n\n<p>when proper responsibility is assigned and action is taken against the misuse of data, creating stronger<br>incentives for protecting data fully. Users provide data, and that data is indeed public, but that does not<br>mean it can be used in any way people want. It\u2019s the responsibility of controllers and<br>processors to make better efforts to inform the public exactly how their data is used. This would<br>create a strong balance between consent and use of data.<br>These rules apply not only to commercial businesses but also to educational institutions and research<br>teams. Elizabeth Buchanan writes in her paper titled \u201cConsidering the ethics of big data research: A case<br>of Twitter and ISIS\/ISIL\u201d about how a model called \u201cIterative Vertex Clustering and Classification,\u201d or IVCC<br>for short, was used to identify ISIS and ISIL sympathizers and community members on Twitter. \u201cThis<br>method enables greater detection of specific individuals and groups in large data sets, with its enhanced<br>capabilities to identify and represent following, mention, and hashtag ties.\u201d (Buchanan) The ethical<br>concern behind this model is that even though the data is public and produced by the individuals<br>themselves, it is difficult to \u201cprotect individual liberties\u201d (Buchanan). A good example of this is<br>when whistleblower Edward Snowden revealed that the National Security Agency had a history of<br>unconventional surveillance methods, raising many ethical concerns about how such methods can affect<br>constitutional rights.<br>When using social media platforms, users must agree to a Terms of Service agreement. 
Terms of<br>Service agreements set forth by platforms like Twitter, Facebook, and Google have a history of being<br>lengthy, hard to understand, and generally unread before being accepted. Generally, a TOS or TOU<br>agreement covers use of content, ownership of content, and the platform\u2019s access to personal<br>information. There are many reasons why people do not read or fully understand TOS agreements<br>before accepting them. Often, the length and complexity are an issue. Some assume safety in the<br>fact that millions of other users also use the platform with no concern. Sometimes people find that the<br>benefit of the platform\u2019s services outweighs their privacy concerns. The problem lies in how data is<\/p>\n\n\n\n<p>collected and used after the agreement is accepted and in the constant changing of these agreements. \u201cOne may implicitly<br>agree to one\u2019s data sources being used for marketing purposes while that same person would not want<br>their data used in intelligence gathering. But big data research does not necessarily provide us with the<br>opportunity to consent to either use, regardless of the intent.\u201d (Buchanan) This creates an unfair ethical<br>dilemma for the users of these platforms. Companies like Twitter are less than transparent about how<br>their users\u2019 data is collected and used in general. Though the GDPR does not specifically mention terms<br>of use, service, or conditions in its regulations, it does require that data be collected and used for<br>specified purposes and only in the manner made transparent to the public and the end<br>user. TOS agreements may not be mentioned, but privacy policies are a must, and these can overlap one<br>another. 
\u201cA Privacy Policy is required by the GDPR and other privacy laws in order to protect users and<br>ensure proper business practices by website owners and app developers.\u201d (Bass) Furthermore, if that<br>data is used in a way not specifically intended, it is the responsibility of the company to notify its users<br>of the \u201cbreach\u201d in a timely manner.<br>How does this relate to the theory of contractarianism and the \u201cveil of ignorance\u201d? It all relates<br>back to fairness for all. Users of these platforms should understand that they have no reasonable<br>expectation of privacy, and many are aware that their data can be mined and used in ways not intended,<br>but they are not given much choice in the matter of how the data is used. It is important and only fair<br>that users understand how their data is being collected and used. The GDPR helps EU citizens in this<br>respect, and adopting similar regulations in the United States could not only benefit the end users of these<br>platforms but also protect companies like Twitter by establishing trust with their users through<br>transparency and by creating baselines for user privacy.<br>Using contractarianism and Rawls\u2019s \u201cveil of ignorance,\u201d I find it fair not only to citizens of the<br>United States but also to companies to benefit from regulations like the GDPR. Users will benefit from a<br>fair level of transparency and will not need to rely only on an ambiguous \u201cbest effort\u201d at protecting their<\/p>\n\n\n\n<p>personal data. Additionally, this can benefit businesses in many ways as well. Businesses can earn the<br>trust of their users by explaining exactly how data will be used and processed and, when data is misused or<br>breached, by taking proper action to notify users in a timely manner. It may be difficult for<br>companies to adopt principles like this, but many have already made conscious decisions to shift most of<br>their focus to end-user privacy. 
A set of federal regulations in the United States can protect companies<br>by giving them a clear understanding of the guidelines they need to meet so they are not subjected to<br>hefty fines and repercussions from the poor handling of data. It will give businesses a better idea of how to<br>be compliant and avoid massive financial loss, as well as the loss of users on their platforms brought on by<br>user mistrust. This will not be an easy effort. Many businesses may face challenges constructing and<br>implementing new guidelines to comply with privacy requirements, but the benefits outweigh the costs. There<br>will be users who may leave platforms and withdraw consent. This could be problematic, as many<br>companies rely on a massive number of users and their data to generate<br>revenue. Addressing all these concerns will not be easy, but it could improve the relationship<br>between end users and businesses. With a certain level of mandatory transparency, many who previously left might feel<br>safer using these platforms.<\/p>\n\n\n\n<p>Works Cited<br>Bass, Ross. \u201cWill the GDPR Affect Your Terms and Conditions Agreement?\u201d TermsFeed, 15<br>June 2018, www.termsfeed.com\/blog\/gdpr-terms-conditions\/. Accessed 12 Feb. 2023.<br>Buchanan, Elizabeth. \u201cConsidering the Ethics of Big Data Research: A Case of Twitter and<br>ISIS\/ISIL.\u201d PLOS ONE, vol. 12, no. 12, 1 Dec. 2017, p. e0187155,<br>https:\/\/doi.org\/10.1371\/journal.pone.0187155.<br>Clark, Kendra. \u201cGoogle\u2019s $400m Penalty and Impact of the 5 Heftiest Data Privacy Fines on<br>2023 Ad Plans.\u201d The Drum, 15 Nov. 2022, www.thedrum.com\/news\/2022\/11\/15\/googles-<br>400m-penalty-the-impact-the-5-heftiest-data-privacy-fines-2023-ad-plans.<br>Palmer, Danny. \u201cWhat Is GDPR? 
Everything You Need to Know about the New General Data<br>Protection Regulations.\u201d ZDNet, 17 May 2019, www.zdnet.com\/article\/gdpr-an-<br>executive-guide-to-what-you-need-to-know\/.<br>Zimmer, Michael. \u201c\u2018But the Data Is Already Public\u2019: On the Ethics of Research in Facebook.\u201d<br>Ethics and Information Technology, vol. 12, no. 4, 4 June 2010, pp. 313\u2013325,<br>https:\/\/doi.org\/10.1007\/s10676-010-9227-5.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The industry around data grosses billions of dollars annually. Data on people around the world isused in numerous ways such as research, marketing, surveillance and more. In 2018 the General DataProtection Regulation or GDPR for short was put into effect across the European Nations. This was doneto help protect individuals\u2019 privacy and protect their personal&#8230; <\/p>\n<div class=\"link-more\"><a href=\"https:\/\/sites.wp.odu.edu\/ksers001\/sample-page\/law-and-ethics-phil-355e\/cases-analysis-on-data\/\">Read 
More<\/a><\/div>\n","protected":false},"author":24633,"featured_media":0,"parent":221,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"_links":{"self":[{"href":"https:\/\/sites.wp.odu.edu\/ksers001\/wp-json\/wp\/v2\/pages\/234"}],"collection":[{"href":"https:\/\/sites.wp.odu.edu\/ksers001\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/sites.wp.odu.edu\/ksers001\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/sites.wp.odu.edu\/ksers001\/wp-json\/wp\/v2\/users\/24633"}],"replies":[{"embeddable":true,"href":"https:\/\/sites.wp.odu.edu\/ksers001\/wp-json\/wp\/v2\/comments?post=234"}],"version-history":[{"count":1,"href":"https:\/\/sites.wp.odu.edu\/ksers001\/wp-json\/wp\/v2\/pages\/234\/revisions"}],"predecessor-version":[{"id":237,"href":"https:\/\/sites.wp.odu.edu\/ksers001\/wp-json\/wp\/v2\/pages\/234\/revisions\/237"}],"up":[{"embeddable":true,"href":"https:\/\/sites.wp.odu.edu\/ksers001\/wp-json\/wp\/v2\/pages\/221"}],"wp:attachment":[{"href":"https:\/\/sites.wp.odu.edu\/ksers001\/wp-json\/wp\/v2\/media?parent=234"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}