Alexander Conrad
Professor Nicol
Phil 355E
1 October 2023
2.4 Case Analysis on User Data
The General Data Protection Regulation (GDPR) is a set of policies put in place by the European Union that took effect in 2018. These policies are meant to strengthen data protection on the internet and give EU citizens more control over the personal data they share online. Because data breaches are inevitable, organizations must inform their customers when their data has been tampered with or stolen; otherwise the company has to pay a fine. In this Case Analysis I will argue that consequentialism shows us that the United States should follow Europe's lead, because bringing the whole internet under this kind of regulation would make it a safer place to keep your data and more consumer friendly for people who want to access their information. Zimmer's reading on the ethics of research using Facebook data illustrates why such protection matters.
In the reading "'But the data is already public': on the ethics of research in Facebook," Zimmer examines the central question of whether publicly available data can ethically be collected and passed on to third parties. Researchers collected data from the Facebook accounts of college students and tried to conceal the identity of the institution, but people were still able to identify it from clues in the data itself, putting the students' privacy at risk. Using consequentialism as a tool for moral reasoning, we can conclude that it is ethically wrong for companies to fail to protect your data and then not tell you when it has been hacked or tampered with. This case took place back in 2010, before the General Data Protection Regulation existed in Europe. Facebook is a United States company, and if the entire platform, along with other platforms, were put under this regulation, customers would be able to access their data more easily and more safely. Facebook would also face fines under the GDPR if it lied to customers about their data being stolen or sold to third parties. From a consequentialist standpoint, the better outcomes this would produce show that the United States should adopt this regulation so that customers can keep track of their own private information and access or change it in case of an attack. Companies could still sell users' information or feed it into algorithms, but these added policies would help the average user and force companies to be open about what happens to that information. Companies are already legally required to ask for your information, but these added regulations would provide another layer of security for users and their private data.
Zimmer also points out in his paper that the researchers' justification for not being worried was that the information was already public, and that nothing private was used, even though the identities and privacy of the university students were still put at risk. It would be ethically wrong to let people's privacy be compromised simply because their information could be traced in some way or because a company's data servers were attacked by bad actors. These attacks happen and companies have to be prepared for them; however, the information that gets leaked or stolen is typically information users do not share publicly, such as names, credit card numbers, and Social Security numbers. Some users are more open and share their private information, but the average user does not, so when even a small amount of information, such as their behaviors or search history, can be traced back to them, it can be devastating to the user's privacy as well as to the company's reputation. Zimmer also raises the question of what attackers can do with this information, which could otherwise have been changed or protected if users had known about the breach. The information could be used to uncover more personal details or sold on the black market to other bad actors. It could also be mined or used in algorithms for good purposes, but it can be misused if users have not agreed to their information being used in those ways.
Buchanan makes points regarding the ethics of mining users' information and using it in algorithms, and argues that users should fight to protect their liberties. Buchanan explains that this is a long-running battle for users around the world and their privacy. Essentially, companies like Twitter (X) mine users' information and feed it into algorithms in order to provide users with a better experience and interface. The point of such an algorithm is to learn from how a user behaves on the platform so it can keep providing a good experience and keep the user on the platform longer. However, in doing this, companies take in information that is not covered in privacy agreements, such as search histories, interactions, and the people you follow. When this data is mined from hundreds to thousands of users, it adds up to an enormous amount of information that some users consider private or personal and that could potentially be linked back to them. According to Buchanan, some users will agree in a privacy agreement to have their information used for marketing purposes, such as advertisements tailored to their area or their likes, but that same person may not want their data used for intelligence gathering, which is what these algorithms amount to. However, big data research does not give us a chance to consent to either use separately, whatever the intentions; platforms rely on one overall policy that you have to agree to in order to use them.
If the United States made the GDPR a regulation every company had to follow, it could benefit both businesses and consumers. For businesses, it would guarantee protection safeguards built in by design, as well as other ways to stay protected, so businesses could still collect and analyze personal data while customers' privacy remained protected. Users would gain more protection and easier access to their own personal data and to how it is processed, along with the right to know when their data has been hacked or exposed and how they can limit that exposure. The GDPR also provides a right for information to be forgotten or deleted from data processing and collection: whenever users decide to revoke their consent to having their information collected and analyzed, they can be assured their information will be removed. From a consequentialist perspective, this would be the best way to make sure both businesses and users are content.
To conclude, the United States should adopt something similar to Europe's privacy laws, based on the General Data Protection Regulation. Although it could be argued that companies already follow similar practices and that this would not change the danger private information faces in cyberspace, it would be wise to have these regulations and safety precautions in place ahead of inevitable future attacks. Considering that control over your information is a right you have no matter where you are, including online, it is important that the companies, both public and private, to which you give your information every day protect it in a way that lets you access it without trouble, and that they be transparent about what they use it for and how well it is protected. From a consequentialist view, this might slow down some companies or some processes on their platforms, but it would still be better than how private information is protected and used today.