2.4. Case Analysis on User Data
In this article, Palmer discusses the implementation of the General Data Protection Regulation, also known as GDPR. This regulation acts as an additional layer of protection for internet users in the European Union. Palmer states that the new rules give EU citizens far more control over their personal data, noting that “almost every aspect of our lives revolves around data” (Palmer, 2019). The article explains that personal data is collected and used for analysis and, most disturbingly, that this information is stored by many companies and used for their own benefit. It is well known that breaches happen and that information can be leaked; often, those who obtain it have foul intent and use personal data for corrupt purposes. GDPR is a regulation that helps reduce the frequency of breaches by enforcing strict data collection policies and obliging digital companies to respect users’ private data. In this Case Analysis, I will argue that deontology/Kantianism shows us that the United States should follow Europe’s lead because it is in the best interest of citizens to be protected digitally.
One concept Zimmer examines in his article, “But the data is already public: on the ethics of research in Facebook,” is the use of Research Assistants in the “Tastes, Ties, and Time” (T3) project. Zimmer explains that hiring Harvard students as Research Assistants gave the researchers access that might otherwise have been denied. For example, if a Research Assistant was Facebook “friends” with a subject who kept a private account, the rest of the research team could view information that was otherwise private. This allowed personal and private information to fall into unwanted hands without the subject’s knowledge. Such an approach to obtaining information is hardly pure, since it relies on going through multiple intermediaries to reach data that was only partially public.
When this case is examined through Kantianism, or deontology, the use of Research Assistants in a project requiring so much personal data appears unethical. Kantianism holds that a common set of moral rules applies to all people, regardless of situation or history, so obtaining private information for one’s own research would be considered deceitful. In Kantianism, an action is good or bad depending on the intent behind it. From the article, the stated intent of using this private information was simply research. Unfortunately, there is no way of knowing whether that was the sole intent; it is possible the initial motive was not pure. Beyond merely gathering the data, the T3 project did not go the extra mile to protect its subjects’ information. Under GDPR, obtaining such personal information without the individuals’ direct consent would already raise concerns about the legitimacy of the researchers’ purpose. GDPR would treat this form of data gathering as an invasion of personal boundaries, and such a daring act could result in fines. When information began to leak, the T3 team simply disappeared instead of resolving the issue. That is ethically wrong from a deontological perspective, since the researchers did not act with morally good intentions once the problem of re-identification arose. Because GDPR’s central purpose is to protect privacy and the freedom it affords, even a slight risk of re-identifying researched individuals would raise red flags, since it directly opposes the regulation’s aims.
The fact that the project contacted the university’s administrative team to obtain information about the students, such as their email addresses, without consulting the students personally shows a lack of respect for the people being researched. Going to such lengths without individual consent implies a selfish and deceitful intent to use the data. If the task was so simple and harmless, why go through a third party instead of asking each individual for permission to use their private information? I believe this was avoided out of fear of negative feedback and refusal. It would have been righteous, and Kantian, for the “Tastes, Ties, and Time” project to obtain personal consent from each subject and to provide a full explanation of how and why their information would be used. Instead, the researchers offered false reassurance that the information would be hard to trace back to individuals. When it became obvious to the public that the data could be re-identified with a little effort, the T3 team made the most unethical decision of all: to stay silent and all but disappear from existence while the storm blew over. This is the most immoral way to address a problem of their own making. Under GDPR, such a disrespectful and irresponsible approach to individual privacy would be fined. GDPR insists that companies be transparent and fair, and this team was neither when it came to explaining how individuals’ information would be used. In a Kantian world, T3 would have explained its intentions openly, asked for forgiveness, and moved on with a clean conscience.
In her article “Considering the ethics of big data research: A case of Twitter and ISIS/ISIL,” Buchanan explores the ethics of using user data to identify ISIS followers. She questions where the ethical boundaries lie when an individual’s data is swept into big data research. Buchanan explains that publicly accessible information is used in various ways to identify and analyze trends and potential threats, and she urges readers to consider whether big data companies act ethically when they reduce human beings to mere data points and data subjects. She invites readers to recognize that big tech uses the personal information we provide for purposes that are never fully disclosed to users, and that users have no option to consent to, or opt out of, such research and analysis. Buchanan also notes that these companies do not fully disclose how they use the information they gather or how the people studied might be affected. In light of GDPR’s policies and regulations, this article raises awareness of how much information is exposed to the public. It shows that without regulations like GDPR, companies are audacious and relentless in using personal information for their own agendas. Under GDPR, companies would be held accountable: they would have to be transparent about the information they use and allow individuals to consent to its use.
Viewed through deontology and Kantianism, big data companies are also acting unethically. Holding personal information and using it for their own ends without telling users means they are not acting to protect individuals’ integrity or to respect their choice of privacy. These companies argue that obtaining consent would be impractical because the data they use is publicly accessible. Deontology, however, demands acting from pure intent, and big data firms do not reveal exactly what they use this public information for or how it will ultimately affect the individuals studied. They also identify trends and can spot recruiters and sympathizers of terrorist groups, which, as Buchanan notes, is a genuine public safety concern.
That said, I believe private individuals still have the right to consent to, and to know about, the use of their information. With Kantianism as the governing rule of morality, this exploitation of personal information is ethically immoral. Just because personal information is publicly accessible does not mean it is meant for public use, analysis, or storage. Failing to hold companies responsible creates exactly the gray boundary between public and private information that GDPR exists to eliminate. This brazen use of public information strips away dignity and respect and turns a human being into a digital number; as Buchanan observes, it reduces a living person to a mere data subject, a live tool for research. Since most people prefer their personal information to be secure, using it without consent violates their privacy and makes this an ethically immoral way of conducting business, even when public safety is invoked. I believe big tech has crossed the ethical line in using personal data for its own gain, whether for safety, business, marketing, or general knowledge. Human information is not there to be used however one pleases; it exists so people can connect and stay in touch for their own private purposes. Big tech should be required to include informed consent in its user agreements or else deny access to users who refuse. People deserve to know what is being done with their information, and denying them that basic right is ethically wrong because it is not done with pure intent. Companies should be required to disclose their plans for the data they collect, and a regulatory system like GDPR would ensure that they do.
To conclude, I believe users must be informed about how their information is being used, whether for research, a simple graph, a public safety concern, or general knowledge. My argument is that the United States should adopt privacy rules and regulations like those in the European Union. This would guarantee users respect, dignity, and honesty from the very beginning. Since Kantianism and deontology focus on the intent behind an action, using personal information for one’s own purposes cannot stem from a pure and good intention when the user is neither consulted nor informed. Without such rules, the fact that users posted information publicly (they could, after all, have stayed offline) can be turned against them in research and other pursuits of big tech and individual researchers. This arrangement only hurts vulnerable individuals and needlessly exposes private information to the public. As for the public safety argument, most terrorists will not publicly announce their next actions and will do their best to stay off social media, so collecting such data is often merely an excuse to use private information for unrelated purposes. Admittedly, complete social and internet privacy could hinder legitimate uses, such as law enforcement identifying criminals through social media. Overall, private information that is publicly accessible should be used only when the intent is clear and pure, the user consents, and the information is used solely to protect, respect, and preserve the dignity of its owners.