Data breaches are an inevitable threat to companies and organizations, and they pose an
especially serious risk to individuals’ privacy and personal information. Thankfully, the
European Union has created a set of rules that will allow EU citizens to be more in
control over their personal data. This set of rules is called the General Data Protection
Regulation and these rules apply to any organization that is in the EU or any
organization that offers its goods or services to EU citizens or companies. Under these
laws, organizations must ensure that personal data is collected lawfully and under
strict conditions, protect it from misuse and exploitation, and respect the rights of
data owners, or else face penalties such as fines. The GDPR also extends
the definition of what type of data is considered personal data, like a user’s IP address
or their biometric data, to be in line with the digital age. The GDPR further requires
organizations to notify consumers as soon as possible if their data has been breached.
Consumers are also entitled to access their personal data, see how it is being
processed, opt out of a company’s use of their data, have their information removed
from a database, and require organizations to explain how customer information will be
used. In this case analysis, I will argue that contractarianism shows us
that the United States should follow Europe’s lead in adopting similar privacy laws to
better protect user data and hold accountable organizations that mishandle the data
they collect from individuals.
In 2008, a group of researchers released data they had collected from US college
students who were using Facebook. Despite their efforts to hide and protect any
personal information, the data source became identifiable, putting the students’
personal information at risk. The privacy protection steps implemented in this database
project included several measures. First, only university research assistants (RAs)
were allowed to access the collected data, provided they had permission to use it.
Second, all identifying information was deleted or encoded after the data was
downloaded. Third, there was a delay in the release of cultural labels to ensure
students’ identities remained anonymous. Fourth, anyone wanting to access any part of
the database had to agree to a “terms and conditions for use” policy. Finally, the entire
research project had to be reviewed and approved by the university’s review committee.
While these steps might seem sufficient to protect the privacy of the subjects, Zimmer
analyzed each step and highlighted their flaws regarding privacy protection. Students
might have been comfortable with RAs viewing their Facebook accounts because they
attended the same school and were part of the same network. However, students might
not approve of outsiders with no connection to them viewing their pages or information,
which could happen once the RAs upload the students’ information to the public database. The
researchers claimed they removed or encoded information that could identify someone.
However, the database still revealed unique attributes that could lead to privacy
breaches, such as having a single student from states like Delaware or Montana and
identifying students by rare ethnicities like Iranian, Malaysian, Hungarian, or Nepali. The
researchers’ decision to delay the release of unique cultural labels by three years does
not alleviate privacy concerns; it merely postpones them. These labels will eventually be
viewable, compromising privacy in the future. Although the researchers included a
“Terms and Conditions of Use” agreement to protect students’ identities and prevent
malicious use, many users likely skip through these agreements.
Moreover, it is unclear how the researchers plan to monitor or enforce these
terms. This database project poses a significant risk to the students whose information
was collected. Some students might not even be aware that their information is included
in the database. Under the General Data Protection Regulation (GDPR), the university’s
database project would have been conducted more ethically. The GDPR would require
the research team to notify students before collecting their information from Facebook
and explain how their data would be used and protected. Students would also have the
right to have their personal data removed or deleted from the database if they desired. If
the US had laws similar to the GDPR, the privacy risks faced by these university
students would be significantly reduced.
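The uniqueness problem Zimmer identifies can be illustrated with a short sketch. The data below is hypothetical, not the actual study records: the point is only that once a combination of quasi-identifiers, such as home state and ethnicity, occurs a single time in a dataset, removing names no longer protects that person.

```python
# Hypothetical illustration of the re-identification risk Zimmer describes:
# "anonymized" records can still be unique on quasi-identifiers.
from collections import Counter

# De-identified records: names removed, quasi-identifiers retained.
records = [
    {"state": "California", "ethnicity": "Chinese"},
    {"state": "California", "ethnicity": "Chinese"},
    {"state": "New York",   "ethnicity": "Hungarian"},
    {"state": "Delaware",   "ethnicity": "Iranian"},   # only student from Delaware
]

# Count how many records share each (state, ethnicity) combination.
counts = Counter((r["state"], r["ethnicity"]) for r in records)

# Any combination held by exactly one record singles that person out,
# even though no name or ID appears anywhere in the data.
unique = [combo for combo, n in counts.items() if n == 1]
print(unique)
```

Here the two combinations that occur once each identify a single individual, which is exactly why a lone student from Delaware or Montana, or a student of a rare ethnicity, remains exposed in the released database.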
Contractarianism, as a moral theory, emphasizes the importance of agreements or
contracts among individuals. In the context of Zimmer’s findings, contractarianism would
argue that the researchers had a moral obligation to enter into a fair contract with the
students, one that would include informed consent, fair terms and conditions, privacy
protection, and mechanisms for accountability and recourse. Zimmer’s critique of the Facebook data release
underscores significant ethical shortcomings, particularly in terms of respecting
autonomy, ensuring beneficence and non-maleficence, promoting justice, and
maintaining transparency and accountability.
From a contractarian perspective, the researchers’ actions in conducting their
database project could be seen as immoral because they put many students’ privacy in
danger. The GDPR’s guidelines would make students more aware of how their data is used
in such projects and better protected. It is crucial to establish laws similar to Europe’s
GDPR in the US, as this would provide better protection for individuals’ privacy in
various contexts, not just academic research. Ethical research practices must prioritize
the rights and well-being of participants, ensuring that all actions are conducted with
integrity and fairness. Zimmer’s findings highlight the need for robust privacy protection
measures and the importance of ethical principles in guiding research practices.
Buchanan’s study focuses on a big data research model designed to analyze Twitter
users and identify accounts that support ISIS/ISIL. The researchers behind this model
examined 119,156 Twitter accounts to create a method for detecting individuals with
ties to the terrorist group. This methodology aims to provide intelligence agencies,
researchers, and other relevant parties with the tools to investigate ISIS supporters.
While the study’s context and methods are well-defined and reliable, Buchanan raises
significant ethical concerns regarding the research methodology. Buchanan questions
the potential applications and users of this methodology, along with who has access to
and can manipulate the data. He speculates on the implications if the focus shifted from
identifying ISIS supporters to targeting groups like Black Lives Matter supporters. This
highlights the broader issue that such big data methodologies and algorithmic
processes can be used to identify various groups, including frequent McDonald’s
customers, Target shoppers, or political protesters. The researchers involved in this
project acknowledge that big data research can serve both marketing and intelligence
purposes. This dual-purpose nature means individuals might agree to their data being
used for marketing but object to its use for intelligence gathering. Despite this, big
data research typically does not offer a choice, as it inherently reveals information about
individuals and their networks without their explicit consent.
Buchanan emphasizes the ethical responsibility of researchers to treat participants with
respect and protect their well-being. The General Data Protection Regulation (GDPR)
supports these ethical considerations by requiring researchers to inform participants
about the research objectives and obtain their consent before collecting data. Under the
GDPR, participants in the Twitter big data project would have been informed of, and
would have had to consent to, their data being analyzed and stored. Moreover, the GDPR mandates the
protection of personal data from exploitation or misuse. Organizations collecting data
must ensure its security, and in case of a data breach, they are required to inform the
affected individuals within 72 hours. Non-compliance with these
guidelines results in hefty fines.
Adopting similar regulations in the US, Buchanan argues, would be the morally correct
approach based on contractarianism, which emphasizes mutual agreement and fairness
in ethical decision-making. The current methods of data collection in big data projects
are seen as questionable or immoral because they involve using individuals’ information
without their knowledge or consent and without guarantees of data security.
Implementing laws closely related to the GDPR in the US would grant citizens greater
control over their private information. It would ensure that individuals are aware of who
has access to their data, how it will be used, and whether it is secure or compromised.
Buchanan’s exploration of big data research, particularly his study on identifying
ISIS/ISIL supporters through Twitter, brings to light significant ethical concerns that are
deeply rooted in contractarianism, which emphasizes mutual agreements and fairness in
ethical decision-making. Buchanan’s study underscores the potential misuse of big data
methodologies and the necessity of safeguarding individuals’ rights and privacy.
Buchanan’s primary ethical concerns revolve around the potential for misuse, the lack
of consent, and the dual-purpose nature of big data research.
In sum, Buchanan’s study exposes the ethical dilemmas inherent in big data research,
particularly regarding consent and data security. While the methodology can be
effective in identifying ISIS supporters, its broader implications raise concerns about
privacy and the misuse of personal information. Adopting GDPR-like regulations in the
U.S., in line with contractarian principles, would ensure that big data research
respects individual autonomy, secures mutual agreement, and upholds fairness. These
measures would mitigate privacy concerns, prevent the misuse of personal information,
and thereby promote ethical research practices in the digital age.
In conclusion, protecting user data is crucial because it safeguards an individual’s
right to privacy. However, current US laws make it challenging for
individuals to shield their data from organizations that seek to collect and store vast
amounts of information for various purposes. Many US organizations either lack a
commitment to protecting personal information or struggle with proper data security and
accountability. To enhance user data protection, the US should adopt laws similar to the
GDPR. Such legislation would empower individuals to be informed about how their data
is used, decide whether they consent to data collection, and ensure that their data is
more securely protected. Implementing GDPR-like regulations would not only grant
individuals greater control over their private information but also foster transparency and
trust between consumers and organizations. Moreover, these regulations could
establish stringent penalties for non-compliance, incentivizing organizations to prioritize
data protection and invest in robust security measures. Training employees on data
protection best practices and implementing regular audits would further bolster
accountability. Ultimately, adopting comprehensive data protection laws would
contribute to a more ethical digital ecosystem, where individuals can engage with
technology without fear of exploitation. This shift is essential for maintaining public
confidence in digital services and fostering innovation while ensuring that user rights are
upheld.