On May 25, 2018, the European Union’s General Data Protection Regulation (GDPR) took effect. The legislation was created to return control of personal data to the users, who are ultimately the data owners. The GDPR requires any organization that handles the personal data of people in the European Union to comply with strict guidelines on data collection, whether the organization operates inside or outside the Union, holding companies far beyond the EU’s borders accountable for EU residents’ personal data. The regulation is enforced with heavy penalties that can reach into the billions for the largest organizations, since fines are tied to two to four percent of an organization’s annual global turnover (Palmer, 2019).
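To make the scale of those penalties concrete, here is a minimal Python sketch of how the GDPR’s two-tier fine ceiling grows with revenue; the turnover figure used is hypothetical and chosen only for illustration.

    # Illustrative only: rough sketch of how GDPR fine ceilings scale with revenue.
    # The turnover figure below is hypothetical, not taken from any real case.

    def gdpr_fine_cap(annual_global_turnover_eur: float, severe: bool) -> float:
        """Return the maximum possible fine under the GDPR's two tiers.

        Severe infringements: up to 20 million EUR or 4% of annual global
        turnover, whichever is higher. Lesser infringements: up to 10 million
        EUR or 2%, whichever is higher.
        """
        if severe:
            return max(20_000_000, 0.04 * annual_global_turnover_eur)
        return max(10_000_000, 0.02 * annual_global_turnover_eur)

    # A company with 50 billion EUR in annual turnover faces a cap of 2 billion EUR.
    print(gdpr_fine_cap(50_000_000_000, severe=True))  # 2000000000.0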
The European Union broadly defines personally identifiable information, or “personal data,” as “any information relating to an identified or identifiable natural person.” An identifiable person is someone who can be identified, directly or indirectly, by an identification number or by information specific to their physical, physiological, mental, social, economic, or cultural identity (Zimmer, 2010, p. 319). In this Case Analysis I will argue that the ethical principle of Ubuntu shows us that the United States should follow the European Union’s lead, because the EU’s much more inclusive legislation acknowledges our shared humanity and recognizes that data is attached to a human being who belongs to a community.
In the case Zimmer reviews, researchers collected data from Harvard undergraduate students’ Facebook profiles over four years, and the Tastes, Ties, and Time (T3) study then posted its analysis and dataset publicly online. The researchers claimed that the information had been “cleaned” to prevent any breach of privacy. That claim did not hold up: re-identification was shown to be possible, so the release compromised the Harvard students’ information. The students are what the GDPR would consider identifiable natural persons, because re-identification posed a threat to their physical, physiological, mental, social, economic, or cultural identity.
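Re-identification of a “cleaned” dataset is typically a linkage attack: joining the release to some public source on shared quasi-identifiers (Narayanan & Shmatikov, 2010). The short Python sketch below illustrates the idea; the column names and records are hypothetical and are not drawn from the actual T3 data.

    # Hypothetical illustration of a linkage attack: joining a "cleaned" research
    # dataset to a public source on quasi-identifiers can re-identify people.
    import pandas as pd

    # "Anonymized" release: names removed, but quasi-identifiers remain.
    cleaned = pd.DataFrame([
        {"class_year": 2009, "major": "Economics", "hometown_state": "VT",
         "private_attr": "political views listed on profile"},
    ])

    # Public directory-style information (entirely made up here).
    directory = pd.DataFrame([
        {"name": "Student A", "class_year": 2009, "major": "Economics",
         "hometown_state": "VT"},
    ])

    # Joining on the shared quasi-identifiers links the "anonymous" row to a name.
    reidentified = cleaned.merge(directory, on=["class_year", "major", "hometown_state"])
    print(reidentified[["name", "private_attr"]])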
Zimmer uses a Kantian, dignity-based theory to argue that organizations can wrong their users even when no recognizable harm occurs, simply by disregarding our dignity as autonomous beings. He contrasts this with a utilitarian, harm-based theory, under which the wrongness of a privacy breach stems from the harm it brings to people. By failing to address their knowledge gaps about privacy and failing to fully consider the privacy implications of their study, the Tastes, Ties, and Time research team undermined the data owners’ autonomy: they lost control of the data and put the students at risk of a personal data breach. From a harm-based perspective, the wrongness of the T3 team’s conduct lies in the violation of privacy that could have occurred if someone cracked the data set to identify who is who, and in the further damage that could have been done by using the study’s information to access even more personally identifiable information.

Applying the ethical perspective of Ubuntu, I believe the T3 researchers should have considered their subjects as humans and not merely as sets of data to be analyzed. The students are humans who are part of a shared community; if the researchers had carried out their work with more empathy and awareness of the students’ experiences as beings within that community, I believe it would have produced a more humane and informationally secure outcome. If the General Data Protection Regulation had been in effect in the U.S. during the T3 research, I think the analysts would have been more mindful that the data is attached to a person. The GDPR reinforces this idea by stating directly in the text of the law that it protects the people of the community. As a community, our rights are achieved only through belonging to a group and through mutual recognition of one another as human beings sharing this experience.
Buchanan argues that the intent of an analysis and the subjects’ expectations of privacy should be considered together during research. Currently, however, organizations that collect big data do not give us options to consent to different types of use; user agreements take a blanket approach, covering every use as a condition of using the platform. The Tastes, Ties, and Time study failed to meet its subjects’ expectations of privacy. Some users had taken steps to increase their privacy by setting their profiles to be viewable only within their network, so while the information was public to some, it was not public to all. When T3 first posted its research, the dataset could be accessed by anyone who agreed to the terms of use, essentially making profile information that had been visible only to some visible to all, and linkable back to the students through re-identification of the public data set, breaching the students’ expectations of privacy. Although the study intended to maintain the students’ privacy, that expectation was not met.
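A blanket agreement and purpose-specific consent are easy to tell apart in structure, and the GDPR’s consent rules push toward the latter. Below is a hypothetical Python sketch of a per-purpose consent record; the purposes listed are invented examples, not anything a real platform offers.

    # Hypothetical sketch: per-purpose consent instead of one blanket agreement.
    from dataclasses import dataclass, field

    @dataclass
    class ConsentRecord:
        user_id: str
        # Each purpose is consented to (or refused) separately and can be withdrawn.
        purposes: dict = field(default_factory=lambda: {
            "provide_service": False,
            "academic_research": False,
            "targeted_advertising": False,
            "sharing_with_third_parties": False,
        })

        def allows(self, purpose: str) -> bool:
            return self.purposes.get(purpose, False)

    consent = ConsentRecord(user_id="student-123")
    consent.purposes["provide_service"] = True   # agreed only to use the platform
    print(consent.allows("academic_research"))   # False: no blanket coverage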
If the United States had laws similar to the General Data Protection Regulation in place, the T3 study would have had a data protection officer (DPO). The role of the DPO is to act as the GDPR authority, or enforcer, within the organization. The DPO would have ensured that analysts were properly trained on GDPR compliance, completed regular audits to verify that compliance was being maintained, kept records of all data processing activities carried out by the analysts, and responded to the students to explain how their private information was being used and what safeguards the study had put in place to protect their profile data (Koch, 2023). Viewed through Ubuntu, the GDPR recognizes our freedom to control our personal data as a community and actively designates a member of that community, the DPO, to enforce this idea, showing that we gain even our right to privacy through belonging to a shared community and recognizing one another. The GDPR also honors people’s humanity by keeping them involved in the data collection process, allowing them to opt out and keeping them informed, in understandable terms, about how their data is being managed.
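As one illustration of the record-keeping duty mentioned above, here is a hypothetical Python sketch of what a single record of processing activities might look like. The fields loosely follow GDPR Article 30 and the values are invented, so this is a sketch rather than a compliance template.

    # Illustrative sketch of a record of processing activities a DPO might keep
    # (fields loosely follow GDPR Article 30; values are invented for this example).
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class ProcessingActivityRecord:
        controller: str
        purpose: str
        data_categories: list
        data_subjects: str
        recipients: list
        retention_period: str
        safeguards: str
        recorded_on: date

    record = ProcessingActivityRecord(
        controller="University research team",
        purpose="Longitudinal study of student social networks",
        data_categories=["profile interests", "friendship ties", "housing group"],
        data_subjects="Undergraduate students in one cohort",
        recipients=["internal research staff only"],
        retention_period="Duration of the study, then deletion",
        safeguards="Access controls; no public release of raw data",
        recorded_on=date(2018, 5, 25),
    )
    print(record.purpose)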
The T3 study disregarded its subjects as human beings by not keeping them involved in the collection of their data; once the subjects agreed, communication largely ended. The study acknowledged the students as data but not as humans, which shows how dehumanizing the handling of their privacy was, regardless of intent. I have to agree with Buchanan that we as people have become data sets and subjects, which is moving us further from humanity.
In closing, the United States should follow the European Union’s lead on data collection legislation, because Europe has established legislation that gives power over data collection back to the community. The General Data Protection Regulation first and foremost acknowledges that data is connected to people. The legislation centers on the people of the community and includes members of that community to uphold our data freedoms, ensuring that organizations meet the GDPR’s guidelines and reducing the negative consequences when they do not, and it enforces the idea that we are granted our rights only through mutually recognizing each individual within the shared community.
The ethical perspective of Ubuntu highlights the lack of humanity considered in the Tastes, Ties, and Time study. This was shown in how the analysts disregarded the students’ expectations of privacy when publishing the study openly to the public, harming the students by undermining their autonomy as human beings, all while operating within current United States laws and university regulations. Eventually, the dataset was removed by the authors in agreement with Harvard. While I think the GDPR is a strong, all-encompassing data protection law, it does have drawbacks, such as restricting users to sites that comply with it. I worry this could limit access to information, or perhaps that is simply the price we have to pay if we truly want more control over our personal data.
References
Buchanan, E. (2017). Considering the ethics of big data research: A case of Twitter and ISIS/ISIL. PLOS ONE, 12(12). https://doi.org/10.1371/journal.pone.0187155
Koch, R. (Ed.). (2023, September 14). What are the Data Protection Officer roles and responsibilities? GDPR.eu. https://gdpr.eu/data-protection-officer-responsiblities/
Narayanan, A., & Shmatikov, V. (2010). Myths and fallacies of “personally identifiable information.” Communications of the ACM, 53(6), 24–26. https://doi.org/10.1145/1743546.1743558
Palmer, D. (n.d.). What is GDPR? Everything you need to know about the new General Data Protection Regulations. ZDNET. https://www.zdnet.com/article/gdpr-an-executive-guide-to-what-you-need-to-know/
Zimmer, M. (2010). “But the data is already public”: On the ethics of research in Facebook. Ethics and Information Technology, 12(4), 313–325. https://doi.org/10.1007/s10676-010-9227-5






