{"id":306,"date":"2024-11-18T01:51:34","date_gmt":"2024-11-18T01:51:34","guid":{"rendered":"https:\/\/sites.wp.odu.edu\/hannahklein\/?p=306"},"modified":"2024-11-18T01:51:34","modified_gmt":"2024-11-18T01:51:34","slug":"case-analysis-on-user-data","status":"publish","type":"post","link":"https:\/\/sites.wp.odu.edu\/hannahklein\/2024\/11\/18\/case-analysis-on-user-data\/","title":{"rendered":"Case Analysis on User Data"},"content":{"rendered":"\n<p>Data breaches are an inevitable and a threat to companies or organizations and it is<br>especially a serious threat to people\u2019s privacy and their information. Thankfully, the<br>European Union has created a set of rules that will allow EU citizens to be more in<br>control over their personal data. This set of rules is called the General Data Protection<br>Regulation and these rules apply to any organization that is in the EU or any<br>organization that offers its goods or services to EU citizens or companies. Under these<br>laws, organizations have to make sure that personal data is collected legally and under<br>strict conditions, protect it from misuse and exploitation, and also make sure to respect<br>the rights of the data owners or else face a penalty like a fine. The GDPR also extends<br>the definition of what type of data is considered personal data, like a user\u2019s IP address<br>or their biometric data, to be in line with the digital age. GDPR also requires that<br>organizations notify consumers if their data has been hacked as soon as possible.<br>Consumers are also allowed to access their personal data by seeing how it is being<br>processed, being able to opt out of the company using their data, removing their<br>information from a database, and make organizations explain how they will be using<br>customer information. 
In this case analysis, I will argue that contractarianism shows that the United States should follow Europe\u2019s lead in adopting similar privacy laws to better protect user data and to hold accountable organizations that mistreat the data they collect from individuals.<\/p>\n\n<p>In 2008, a group of researchers released data they had collected from US college students who were using Facebook. Despite the researchers\u2019 efforts to hide and protect personal information, the data source became identifiable, putting the students\u2019 personal information at risk. The privacy protections built into this database project included several measures. First, only university research assistants (RAs) with permission were allowed to access the collected data. Second, all identifying information was deleted or encoded after the data was downloaded. Third, the release of cultural labels was delayed to keep students\u2019 identities anonymous. Fourth, anyone wanting to access any part of the database had to agree to a &#8220;terms and conditions for use&#8221; policy. Finally, the entire research project had to be reviewed and approved by the university\u2019s research review committee. While these steps might seem sufficient to protect the subjects\u2019 privacy, Zimmer analyzed each one and highlighted its flaws. Students might have been comfortable with RAs viewing their Facebook accounts because the RAs attended the same school and belonged to the same network. However, students might not approve of outsiders with no connection to them viewing their pages or information, which could happen once the RAs uploaded the students\u2019 information to the public database. 
The researchers claimed they removed or encoded information that could identify someone. However, the database still revealed unique attributes that could lead to privacy breaches, such as a single student coming from a state like Delaware or Montana, or students being identifiable by rare ethnicities such as Iranian, Malaysian, Hungarian, or Nepali. The researchers\u2019 decision to delay the release of unique cultural labels by three years does not alleviate privacy concerns; it merely postpones them, since the labels will eventually become viewable and compromise privacy in the future. Although the researchers included a &#8220;Terms and Conditions of Use&#8221; agreement to protect students\u2019 identities and prevent malicious use, many users likely skip through such agreements, and it is unclear how the researchers planned to monitor or enforce the terms. This database project poses a significant risk to the students whose information was collected; some students might not even be aware that their information is included. Under the General Data Protection Regulation (GDPR), the university\u2019s database project would have been conducted more ethically. The GDPR would require the research team to notify students before collecting their information from Facebook and to explain how their data would be used and protected. Students would also have the right to have their personal data removed or deleted from the database if they desired. If the US had laws similar to the GDPR, the privacy risks faced by these university students would be significantly reduced.<\/p>\n\n<p>Contractarianism, as a moral theory, emphasizes the importance of agreements or contracts among individuals. 
In the context of Zimmer\u2019s findings, contractarianism would argue that the researchers had a moral obligation to enter into a fair contract with the students, one that included informed consent, fair terms and conditions, protection of privacy, and accountability and recourse. Zimmer\u2019s critique of the Facebook data release underscores significant ethical shortcomings, particularly in respecting autonomy, ensuring beneficence and non-maleficence, promoting justice, and maintaining transparency and accountability.<\/p>\n\n<p>From a contractarian perspective, the researchers\u2019 actions in conducting their database project could be seen as immoral because they endangered many students\u2019 privacy. The GDPR\u2019s guidelines would help students be more aware of, and protected in, the use of their data for such projects. It is crucial to establish laws similar to Europe\u2019s GDPR in the US, as this would better protect individuals\u2019 privacy in many contexts, not just academic research. Ethical research practices must prioritize the rights and well-being of participants, ensuring that all actions are conducted with integrity and fairness. Zimmer\u2019s findings highlight the need for robust privacy protections and the importance of ethical principles in guiding research practices.<\/p>\n\n<p>Buchanan\u2019s study focuses on a big data research model designed to analyze Twitter users and identify accounts that support ISIS\/ISIL. The researchers behind this model examined 119,156 Twitter accounts to create a method for detecting individuals with ties to the terrorist group. 
This methodology aims to give intelligence agencies, researchers, and other relevant parties the tools to investigate ISIS supporters. While the study\u2019s context and methods are well defined and reliable, Buchanan raises significant ethical concerns about the research methodology: who might apply it, who has access to the data, and who can manipulate it. Buchanan speculates on the implications if the focus shifted from identifying ISIS supporters to targeting groups like Black Lives Matter supporters. This highlights the broader issue that such big data methodologies and algorithmic processes can be used to identify many kinds of groups, including frequent McDonald\u2019s customers, Target shoppers, or political protesters. The researchers involved in this project acknowledge that big data research can serve both marketing and intelligence purposes. This dual-purpose nature means individuals might agree to their data being used for marketing but object to its use for intelligence gathering. Big data research, however, typically does not offer that choice, as it inherently reveals information about individuals and their networks without their explicit consent.<\/p>\n\n<p>Buchanan emphasizes the ethical responsibility of researchers to treat participants with respect and to protect their well-being. The General Data Protection Regulation (GDPR) supports these ethical considerations by requiring researchers to inform participants about the research objectives and obtain their consent before collecting data. Under the GDPR, the participants in the Twitter big data project would have been informed of, and would have consented to, their data being analyzed and stored. Moreover, the GDPR mandates the protection of personal data from exploitation or misuse. 
Organizations collecting data must ensure its security, and in the case of a data breach they are required to notify the relevant supervisory authority within 72 hours and to inform affected individuals without undue delay. Non-compliance with these requirements results in hefty fines.<\/p>\n\n<p>Adopting similar regulations in the US, Buchanan argues, would be the morally correct approach under contractarianism, which emphasizes mutual agreement and fairness in ethical decision-making. The current methods of data collection in big data projects are questionable, even immoral, because they use individuals\u2019 information without their knowledge or consent and without guarantees of data security. Implementing laws closely modeled on the GDPR in the US would grant citizens greater control over their private information. It would ensure that individuals know who has access to their data, how it will be used, and whether it is secure or has been compromised. Buchanan\u2019s exploration of big data research, particularly the study on identifying ISIS\/ISIL supporters through Twitter, brings to light significant ethical concerns that are deeply rooted in contractarian thinking, with its emphasis on mutual agreement and fairness. The study underscores the potential misuse of big data methodologies and the necessity of safeguarding individuals\u2019 rights and privacy. Buchanan\u2019s primary ethical concerns revolve around the potential for misuse, the lack of consent, and the dual-purpose nature of big data research.<\/p>\n\n<p>In sum, Buchanan\u2019s study exposes the ethical dilemmas inherent in big data research, particularly regarding consent and data security. By aligning with contractarian principles, adopting GDPR-like regulations in the U.S. would ensure that big data research respects individual autonomy, secures mutual agreement, and upholds fairness. 
These measures would mitigate privacy concerns and prevent the misuse of personal information, thereby promoting ethical research practices in the digital age. While the methodology can be effective in identifying ISIS supporters, its broader implications raise concerns about privacy and the misuse of personal information, concerns that GDPR-like regulations in the US could address by providing stronger protections for individuals\u2019 data.<\/p>\n\n<p>In conclusion, protecting user data is crucial because it amounts to safeguarding an individual\u2019s right to privacy. However, current US laws make it challenging for individuals to shield their data from organizations that seek to collect and store vast amounts of information for various purposes. Many US organizations either lack a commitment to protecting personal information or struggle with proper data security and accountability. To enhance user data protection, the US should adopt laws similar to the GDPR. Such legislation would empower individuals to learn how their data is used, to decide whether they consent to data collection, and to have their data more securely protected. Implementing GDPR-like regulations would not only grant individuals greater control over their private information but also foster transparency and trust between consumers and organizations. These regulations could also establish stringent penalties for non-compliance, incentivizing organizations to prioritize data protection and invest in robust security measures. Training employees on data protection best practices and conducting regular audits would further bolster accountability. Ultimately, adopting comprehensive data protection laws would contribute to a more ethical digital ecosystem in which individuals can engage with technology without fear of exploitation. 
This shift is essential for maintaining public confidence in digital services and fostering innovation while ensuring that user rights are upheld.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Data breaches are an inevitable threat to companies and organizations, and an especially serious threat to people\u2019s privacy and personal information. Fortunately, the European Union has created a set of rules that gives EU citizens more control over their personal data. This set of rules, the General Data&#8230; <\/p>\n<div class=\"link-more\"><a href=\"https:\/\/sites.wp.odu.edu\/hannahklein\/2024\/11\/18\/case-analysis-on-user-data\/\">Read More<\/a><\/div>\n","protected":false},"author":29799,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":"","wds_primary_category":0},"categories":[1],"tags":[],"_links":{"self":[{"href":"https:\/\/sites.wp.odu.edu\/hannahklein\/wp-json\/wp\/v2\/posts\/306"}],"collection":[{"href":"https:\/\/sites.wp.odu.edu\/hannahklein\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sites.wp.odu.edu\/hannahklein\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/sites.wp.odu.edu\/hannahklein\/wp-json\/wp\/v2\/users\/29799"}],"replies":[{"embeddable":true,"href":"https:\/\/sites.wp.odu.edu\/hannahklein\/wp-json\/wp\/v2\/comments?post=306"}],"version-history":[{"count":1,"href":"https:\/\/sites.wp.odu.edu\/hannahklein\/wp-json\/wp\/v2\/posts\/306\/revisions"}],"predecessor-version":[{"id":307,"href":"https:\/\/sites.wp.odu.edu\/hannahklein\/wp-json\/wp\/v2\/posts\/306\/revisions\/307"}],"wp:attachment":[{"href":"https:\/\/sites.wp.odu.edu\/hannahklein\/wp-json\/wp\/v2\/media?parent=306"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/sites.wp.odu.edu\/hannahklein\/wp-json\/wp\/v2\/categories?post=306"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sites.wp.odu.
edu\/hannahklein\/wp-json\/wp\/v2\/tags?post=306"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}