Case Analysis on User Data

According to the ethics of care tool, the United States should not adopt something like Europe’s new privacy laws; doing so would have a devastating effect on the economy and on the online experience of consumers. According to Consumers Union Director of Consumer Privacy and Technology Policy Justin Brookman, “Internet policy groups understand that the United States requires some form of online privacy legislation.” He proposed fixing this problem by requiring that firms create clear privacy policies for their users, and he advocated for more forthcoming privacy laws so that consumers can be comfortable with where their data ends up. Even though the internet has been around for 25 years, the United States has yet to enact legislation mandating that private corporations provide adequate safeguards for user data. This is significant because American corporations dominate the Western world’s major technology markets. U.S. lawmakers expect to reach a compromise on a comprehensive privacy policy by 2020. The ethics of care tool suggests that data-protection regulation is not the only choice. Instead, Congress may investigate the monopolistic power of large technology firms. It might analyze the massive imbalance of power between these businesses and their clients. And it could use privacy legislation to establish a larger vision of human welfare that responds proactively to the difficulties of the digital age, though realistically this final option is unlikely to wind up on the statute books. Buchanan states that, regardless of the motivation, the use of big data in research may not give us the option to give our approval for either application. The ethics of care tool therefore suggests that the United States should not adopt the EU’s laws and regulations wholesale, because they could damage U.S. businesses and the economy; instead, companies should develop policies under which they ask people for permission before accessing their data and information.

For research projects that use data gleaned from social media, Zimmer suggests that the nature of consent, the identification of and respect for social network users’ privacy expectations, strategies for anonymizing data before public release, and the relative expertise of institutional review boards are among the ethical concerns that must be addressed before future research on social networking sites can begin (Zimmer 2010). Zimmer also notes that it is not uncommon for Americans to think that their online activity is being monitored because it is in their best interest, or because it is the price they have to pay for free or reduced-cost goods, rather than the other way around, as is more common in Europe (Zimmer 2010). Only half of Americans disapproved of government activities even after Edward Snowden exposed how substantially the government’s power to spy on its citizens had been extended by the Patriot Act.

According to Zimmer, the large-scale collection and storage of personally identifiable information raises the risk of privacy infringement (Zimmer 2010). A breach of privacy may occur if personal information is easily accessible to others who are not authorized to view it. The potential for subsequent, unintended use of information gathered from individuals is known as unauthorized secondary use of personal data (Zimmer 2010). Zimmer states that it is not possible to predict all the ways in which people’s Facebook profile information can be misused once the data has been collected, aggregated, and made available for download. Privacy issues also emerge from probable inaccuracies within datasets, which has led to various regulations guaranteeing individuals the right to examine and correct data gathered about them. Finally, researchers have not taken into account that some people firmly believe that what they share on Facebook stays on Facebook, which is a significant concern (Zimmer 2010).
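To make the reidentification risk concrete, the short sketch below is a hypothetical illustration only; it is not drawn from Zimmer’s study, and the field names and records are invented. It shows how a release that drops names can still be linked back to individuals when quasi-identifiers such as hometown, major, and graduation year remain and can be matched against publicly available information.

```python
# Hypothetical sketch of re-identification: the "anonymized" release drops names,
# but quasi-identifiers that remain (hometown, major, graduation year are invented
# fields for illustration) can be joined against public information.

# Records as they might appear in a "de-identified" research release.
released = [
    {"hometown": "Springfield", "major": "Philosophy", "year": 2009, "interests": "hiking"},
    {"hometown": "Riverton", "major": "Biology", "year": 2009, "interests": "chess"},
]

# Publicly available information (e.g., a campus directory or a news story).
public_directory = [
    {"name": "A. Student", "hometown": "Springfield", "major": "Philosophy", "year": 2009},
    {"name": "B. Student", "hometown": "Riverton", "major": "Chemistry", "year": 2010},
]

def reidentify(release, directory):
    """Link 'anonymous' records back to names via shared quasi-identifiers."""
    matches = []
    for record in release:
        for person in directory:
            if (record["hometown"], record["major"], record["year"]) == (
                person["hometown"], person["major"], person["year"]
            ):
                matches.append((person["name"], record["interests"]))
    return matches

print(reidentify(released, public_directory))
# Prints [('A. Student', 'hiking')]: the first record is no longer anonymous.
```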

Although Europe has admirable rules, such as the General Data Protection Regulation (GDPR), to regulate the use of personal information by big firms and organizations online, these regulations do nothing to protect the digital privacy rights of Americans. Software and IT service providers operating in the United States are obligated by law to grant law enforcement access to all customer data, regardless of where it is physically housed, and the same law shields U.S. service providers from having to inform customers when authorities have demanded their information. Even if European corporations object to a data transfer, the information would likely still be received by the United States government, which would then be free to use it however it sees fit. The EU and the United States also have fundamentally different institutional setups. Brussels serves as the EU’s political center, where leaders from all member states gather to discuss and strategize on how to move the union forward, while Silicon Valley is the economic heart of the United States. Considering the wealth of private information that major internet firms may make available to U.S. government intelligence agencies, the existence of a “long history of close cooperation” between the two sets of organizations is not unexpected. Because information can also equal power, I believe many large tech businesses have a significant monetary incentive to ensure that any internet privacy legislation is inadequate and does not constrain their business models too much.

Significant conceptual gaps exist in the comprehension of the privacy implications of research in social networking environments (Zimmer 2010). The confidentiality of research subjects therefore remains at risk. There are obstacles to the conventional model of consent, to the accurate identification of and respect for privacy expectations on social networking sites, to the development of sufficient strategies for anonymizing data before the public release of personal information, and to the comparative expertise of institutional review boards when faced with such research projects. Researchers of the future will need to learn more about the importance of context when it comes to protecting users’ personal information. It is important to understand the norms of information flow in different contexts, and this knowledge should inform our investigation into what constitutes “permission” when disclosing personal information in online social networking settings.

Research suggests that the GDPR hinders business owners and that its restrictions on data collection do not clearly benefit customers. The GDPR’s fines are too high, especially for startups that have not had time to thoroughly study the new rules. The EU’s GDPR does not need to be translated into American law; while other countries have passed laws to safeguard their residents’ privacy, the United States needs its own measures to catch up. To curb further abuse of technology, authorities should enact privacy legislation that upholds fundamental principles like consent. Privacy regulations should be enacted sooner rather than later so that future innovators can adjust their products to meet the requirements without having to spend extra money. In this way, the ethics of care tool suggests that government privacy measures are necessary to win back consumers’ trust. Zimmer’s concept suggests that firms could be discouraged from engaging in invasive data tracking if privacy restrictions were in place (Zimmer 2010). Any new federal privacy legislation should be enforced with a greater commitment of money and personnel.

Elizabeth Buchanan’s concept shows that the use of social media to track down particular persons inside networks or organizations should not come as a shock, given the exponential growth of these platforms and the ease with which people around the world may access them (Buchanan 2017). Buchanan states that large-scale data extraction and analytics in the name of national intelligence and security have been resisted by ethicists and privacy advocates, but as data have become so accessible, provided by the users themselves, the battle to protect individual liberties appears more and more daunting. Ethical dilemmas relating to privacy, rights, and autonomy, as well as social justice issues like discrimination, have arisen in tandem with the rise of mass data mining across social media and the Internet (Buchanan 2017). The ethics of big data, also known as “real-world data,” have recently been the subject of numerous conferences and scholarly publications in both the United States and the European Union. According to Buchanan, big data analytics and their applications to questions of individual and societal privacy posed difficulties for researchers even before the EU General Data Protection Regulation took effect in 2018.

Buchanan states that the field of big data science has expanded into nearly every other industry, including education, epidemiology, and law enforcement (Buchanan 2017). Given Twitter’s massive number of active accounts and users, as well as its openness for researchers to explore and exploit its data, research with big data in general, of which Twitter is just one source, is expected to grow and to push the boundaries of traditional research methods and ethics principles. Western models of research ethics, which place value on the individual and guarantee the individual’s autonomy through informed consent, do not easily accommodate the study of social network data or big data. Researchers are obligated to ensure the safety and well-being of their subjects and must treat them with dignity and respect at all times, even when the accounts used to mine data are open to the public and the ultimate goal, such as identifying those susceptible to online extremism, is a socially beneficial and commendable one.
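Part of what makes these questions pressing is how little effort such collection requires. As a rough illustration only (the tweepy library, the query string, and the placeholder bearer token below are my assumptions; neither Buchanan nor Zimmer prescribes a particular tool), a researcher can pull public posts with a few lines of code, and none of the account holders are ever asked for consent.

```python
# A minimal sketch of how easily public Twitter data can be gathered for research.
# The tweepy library and the v2 recent-search endpoint are assumptions; the bearer
# token and query below are placeholders, not values from the cited studies.
import tweepy

client = tweepy.Client(bearer_token="YOUR_BEARER_TOKEN")

# Pull a small sample of recent public tweets on a topic of interest.
response = client.search_recent_tweets(
    query="data privacy -is:retweet lang:en",
    max_results=10,
    tweet_fields=["created_at", "author_id"],
)

for tweet in response.data or []:
    # None of these authors were asked for consent, which is exactly the
    # ethical gap that Buchanan and Zimmer highlight for big data research.
    print(tweet.created_at, tweet.author_id, tweet.text[:80])
```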

According to Buchanan, the intelligence community gains access to personal data that was originally collected for marketing purposes (Buchanan 2017). The impact of technological advancements, however, is now explicitly recognized in federal regulations (Buchanan 2017). All of one’s connections in one’s social media landscape matter, no matter how remote or impersonal they may be, and the philosophical and practical questions are countless. It also should not come as a surprise that government agencies such as the police and intelligence services are working hard to improve their knowledge of cutting-edge technologies like social media and big data in order to detect and disrupt communications for law enforcement and security reasons (Buchanan 2017). In the present U.S. regulatory paradigm of the Common Rule, risk and benefit are determined by the extent to which a researcher has access to personally identifiable information or may discover information about the subject through methods such as reidentification. Policymakers, moreover, should not make privacy legislation as sweeping as the General Data Protection Regulation (GDPR). The compliance costs associated with the GDPR are too high for the firms affected by it. Authorities should ask consumers whether they are willing to pay the projected GDPR-induced price increases for the benefit of personal data security; most consumers, the study found, are not willing to pay more to maintain their anonymity online. The GDPR’s opt-in requirement is overly restrictive for both consumers and corporations, so regulators should scrutinize other privacy rules with a similar provision. The ethics of care tool suggests that there should be no needless restrictions on businesses or innovators.
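The practical difference between the GDPR’s opt-in default and the notice-based approach more common in the United States can be made concrete with a small sketch. The following is a simplified, hypothetical illustration; the function and field names are invented, and real compliance involves far more than a single flag. Under opt-in, silence blocks collection; under notice/opt-out, silence permits it.

```python
# Hypothetical sketch contrasting a GDPR-style opt-in default with a US-style
# notice/opt-out default. Names are invented for illustration only.

def may_collect_gdpr(user_prefs: dict) -> bool:
    # Opt-in: collection is allowed only if the user explicitly said yes.
    return user_prefs.get("tracking_consent") is True

def may_collect_notice_based(user_prefs: dict) -> bool:
    # Notice/opt-out: collection is allowed unless the user explicitly said no.
    return user_prefs.get("tracking_consent") is not False

new_user = {}  # has never been asked and never answered

print(may_collect_gdpr(new_user))          # False: silence is not consent
print(may_collect_notice_based(new_user))  # True: silence permits collection
```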

Because any given piece of legislation inevitably has sweeping effects, it is best to have a comprehensive regulatory structure in place. Considering how limited privacy protections in the United States currently are, there is no pressing need to craft any future privacy law with excessive caution; authorities should, however, craft privacy policies that do not simply make California privacy legislation or the GDPR globally applicable. The authorities should also consider the reality that European and American values are distinct from one another. The Facebook-Cambridge Analytica incident highlights the need for stricter rules to prevent such incidents in the future. The United States ought not to blindly implement the GDPR, but the right approach has not yet been found.

While the General Data Protection Regulation (GDPR) and the proposals from U.S. states are not identical, they share the goal of ensuring that data subjects have more say over how their information is used. However, any GDPR-style bill that passes Congress is likely to be severely scaled down and to look more like the present U.S. notice-based model. There is no denying the benefits of data protection regulations modeled after Europe’s, but they will not be enough. Although the GDPR treats data processing as if it were always for a good cause, even properly processed data can be used for oppression and abuse. Data privacy regulations may be myopic if they fail to account for how the data-hungry business sector is wreaking havoc on society at large, including on our planet, our institutions of government, our capacity to focus, and our mental and emotional well-being. The ethics of care tool suggests that data protection at the level of the GDPR would be sufficient, but the United States is too different from Europe to properly adopt and enforce such a system on those terms; its overall system could be disrupted, and people could face losses.

Finally, the ethics of care tool shows that the GDPR is not the only choice, as it can harm the economy and tech companies of the United States; instead, companies should develop privacy policies that protect people’s data and privacy. Zimmer suggests that researchers need to admit that they do not know everything about the dynamic nature of privacy and the difficulties of anonymizing datasets, and that they should assemble a multidisciplinary group of partners to help them avoid repeating the same mistakes. Research on social networks is challenging, so institutional review boards must be assessed and policymakers educated. Data privacy has evolved in the United States and Europe in quite different ways, and it is instructive to compare the two systems. Most notably, the United States lacks universally applicable privacy legislation outside of highly regulated sectors such as health care and finance. Instead, it is up to the data controller to decide what to do with an individual’s information, so long as it has not been dishonest about its intentions. Due to the nature of the law, businesses may get away with writing extraordinarily lengthy terms of service, which almost all customers will accept without reading. Buchanan states that the United States federal regulations governing the safety of human research subjects were last updated in 1991, long before the advent of the revolutionary technologies that are now disrupting the research industry. Humans may unwittingly participate in studies involving marketing, intelligence, or behavioral interventions. The complexities of this seemingly unfettered research domain, of these research methods and ethics, and of the increasingly diluted spaces of social media and big data are daunting. By and large, U.S. customers have decided to hand over data to firms with few restrictions in exchange for free or cheap access to the services those companies offer. The ethics of care tool shows that these broad terms of service are a company’s first line of defense against allegations that it has mistreated its customers, saying, in effect, “you gave us permission.”

Works Cited

Buchanan, Elizabeth. 2017. “Considering the ethics of big data research: A case of Twitter and ISIS/ISIL.” PLOS One 12 (12). doi:10.1371/journal.pone.0187155.

Zimmer, Michael. 2010. “‘But the data is already public’: On the ethics of research in Facebook.” Ethics and Information Technology 12 (4): 313-325.