Case Analysis on User Data

In this analysis I will discuss an article written by Danny Palmer. In his article, Palmer discusses the General Data Protection Regulation (GDPR), which took effect in the European Union on May 25, 2018. In short, the GDPR protects the privacy rights of individuals with respect to businesses and organizations. It sets out specific requirements for how these entities must protect, process, and use the information they obtain, along with requirements for security and for the disclosure of breaches. Palmer first presents a top-down overview of the GDPR and then narrows to the specific courses of action companies and organizations must take to secure the data of individuals in the EU. In this Case Analysis I will argue that Ubuntu shows us that the United States should follow Europe’s lead and create its own version of the GDPR, and that we have a shared responsibility to follow in the footsteps of the EU. This provides a collaborative approach to privacy, in which individuals and organizations come together with a shared vision of privacy and everybody shares the responsibility to protect individuals from harm.

In Zimmer’s paper, he identifies privacy deficiencies in how the members of the “Tastes, Ties, and Time (T3)” project protected students’ data. The goal of the T3 project was to use social media, in this case Facebook, to build a more robust dataset for understanding the nature and dynamics of social networks. Without the consent of the students, but with the consent of the university and Facebook, the T3 researchers downloaded the profiles and network data of freshman students and monitored and updated that data over a four-year period. The researchers believed they had made identifying the students extremely difficult, but they did not understand how to properly secure the information: students could still be identified through their course majors, locations, hometowns, social connections, and cultural tastes. By using local research assistants, the researchers were likely able to obtain information that students believed was only for social networking on Facebook and within the college network, not for an expansive research project. The individuals in this study never consented, and in my view the researchers’ attitude toward the students’ privacy was grounded in protecting the research, not in the individuals’ right to privacy.
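Zimmer’s point about indirect identifiers can be made concrete with a small sketch. The records and names below are entirely hypothetical and invented for illustration; this is not the T3 dataset or the researchers’ actual schema. The idea is simply that when an “anonymized” release keeps fields like major, hometown, and class year, a unique combination of those fields can be joined against public information to recover a name.

```python
# Minimal sketch of a linkage (re-identification) attack.
# All names and records here are hypothetical, invented for illustration;
# this is not the T3 dataset or the researchers' actual schema.

# "Anonymized" research release: names and IDs removed,
# but quasi-identifiers (major, hometown, class year) remain.
anonymized_rows = [
    {"major": "Biology", "hometown": "Springfield", "year": 2009},
    {"major": "Comparative Folklore", "hometown": "Juneau", "year": 2009},
]

# Public information an outsider can gather (e.g., a campus directory).
public_directory = [
    {"name": "Alice Example", "major": "Biology",
     "hometown": "Springfield", "year": 2009},
    {"name": "Bob Example", "major": "Comparative Folklore",
     "hometown": "Juneau", "year": 2009},
]

QUASI_IDENTIFIERS = ("major", "hometown", "year")

def link(record, directory):
    """Return directory entries whose quasi-identifiers match the record."""
    return [
        person for person in directory
        if all(person[k] == record[k] for k in QUASI_IDENTIFIERS)
    ]

for row in anonymized_rows:
    matches = link(row, public_directory)
    if len(matches) == 1:
        # A unique combination of quasi-identifiers re-identifies the person.
        print(f"Re-identified: {matches[0]['name']} <- {row}")
```

A rare major combined with a small hometown is often enough to make a row unique, which is exactly why the GDPR’s broader definition, discussed below, treats such indirect identifiers as personal data.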

Ubuntu ethics calls for a more comprehensive approach, one that involves everyone in the community in a shared responsibility for privacy protection. The T3 researchers believed that removing student names and identification numbers, delaying the release of cultural interests, using a review board, and attaching “terms and conditions” to the use of the information was enough to protect the students’ privacy. In the US we define personally identifiable information (PII) as a fairly basic set of data, such as name, social security number, driver’s license number, or credit card number. In the EU, by contrast, the definition is more comprehensive: in addition to the categories the US recognizes, the GDPR covers any information related to the identity of an individual, directly or indirectly, including physical, physiological, mental, economic, cultural, or social identity. Had the T3 researchers followed the EU’s example, there would have been a higher standard of protection and less identifiable information would have been released. The fact that the T3 researchers recognized the potential for student identification and still put their own project before what was collectively good for the students runs counter to the Ubuntu idea. A regulation like the GDPR in the US would have provided a framework for the research team to take better care in the way they gathered, stored, and distributed the students’ data; in the T3 project, privacy was more of an afterthought than a proper roadmap. A proactive approach, with rules and regulations to follow and a shared responsibility for the safety and quality of the research and the subjects it affected, is far more likely to promote the privacy and well-being of everyone involved.

In Buchanan’s paper, she discusses detecting terrorist networks such as ISIS/ISIL using Twitter, now X, as a dataset. She describes the Iterative Vertex Clustering and Classification (IVCC) model used to identify these groups. The IVCC model uses information such as followers, likes, hashtag references, comments, and mentions to identify individuals or groups that may have terrorist ties. This seems dangerous to me; it brings to mind how suspected communist sympathizers were treated during the McCarthy era: increased surveillance, persecution, and other forms of unofficial punishment for a suspected belief, without any due process. If you have used X as I have, you know this type of data can be easily manipulated and could ultimately cause harm to individuals. X, like Facebook, has fake accounts, accounts that mimic others for a variety of reasons. The data on one person’s account that exhibits no terrorist sympathy or ties can be overshadowed by ten fake accounts that do, which could earn you a classification as a terrorist or terrorist sympathizer when that is not true at all; the sketch at the end of this discussion makes this failure mode concrete. There are also plenty of “trolls” on social media: you could live your physical life one way and your online life a completely different way, baiting people into emotional responses to provoke drama without believing a word you are saying. The IVCC model then picks up on you, or on the fake accounts, and classifies you accordingly, likely a false classification. Because of your views on social media, correct or incorrect, you could face punishment in many forms. Like the suspected communist sympathizers, you could be blacklisted from exercising certain individual rights, detained without trial in the name of “national security,” or put on a no-fly list or some other list the government has access to, all under a presumption of guilt rather than the presumption of innocence we all have a right to.

Again, Ubuntu ethics emphasizes trust and respect for the individual as they relate to the well-being of the community as a whole. The IVCC model, when used as a government surveillance tool, undermines that trust and erodes the community, which runs counter to the goals of Ubuntu ethics. If the US were to adopt a set of regulations like the GDPR, it could go a long way toward instilling trust and dignity in an individual’s privacy and could limit what kind of data is considered “public” on social media platforms like X and Facebook, especially if the US implemented these protections with Ubuntu ethics in mind. What is good for the collective is good for the individual, and vice versa. We move forward as a collective of individuals and organizations with the common goal of protecting privacy and limiting harm to the individual, thus preserving the community, all sharing in the responsibility of care toward it. Government surveillance can undermine these concepts and prevent the collaborative effort of privacy protection between individuals and organizations.
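To make the fake-account worry concrete, here is a deliberately naive sketch of a guilt-by-association score. This is not the actual IVCC model, and the accounts, interactions, and threshold below are all hypothetical; the sketch only shows how fabricated interactions can dominate a simple association-based classifier of the kind the essay describes.

```python
# Toy guilt-by-association scorer, loosely inspired by the kinds of
# signals described above (mentions, likes, follows). This is NOT the
# real IVCC model; it is a deliberately naive, hypothetical sketch of
# how fabricated interactions can distort a classification.

# Hypothetical interaction edges: (source_account, target_account).
interactions = [
    ("real_user", "news_outlet"),
    ("real_user", "friend_1"),
]

# Ten fake accounts created solely to mention the same innocent user.
flagged_accounts = {f"fake_{i}" for i in range(10)}
interactions += [(fake, "real_user") for fake in flagged_accounts]

def association_score(user, edges, flagged):
    """Fraction of a user's interactions that involve flagged accounts."""
    touching = [(s, t) for s, t in edges if user in (s, t)]
    if not touching:
        return 0.0
    with_flagged = [
        (s, t) for s, t in touching
        if (s in flagged) or (t in flagged)
    ]
    return len(with_flagged) / len(touching)

score = association_score("real_user", interactions, flagged_accounts)
print(f"score = {score:.2f}")  # 10 of 12 interactions are fake -> 0.83
if score > 0.5:                # naive, hypothetical threshold
    print("real_user classified as a suspected sympathizer")
```

Ten fabricated mentions outnumber the user’s two genuine interactions, so the naive threshold flags an innocent account. A more sophisticated model can mitigate this, but the underlying risk of manufactured data skewing a classification is exactly the danger raised above.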

We seem to be at a point in history where there is very little trust in government and organizations, and people are deeply divided on how these institutions should be run. I think we should move in the direction of the EU and author our own set of rules and regulations like those in the GDPR, but not just implement the rules: do it with Ubuntu ethics at the forefront of the process, making us all stakeholders in the protection of our privacy and making us all share in that responsibility. We should move forward with great care and respect for the individual, fostering an environment that respects individual rights while also upholding the safety and security of the community. Care should be taken regarding transparency and trust in the process of creating these privacy regulations, not just to have them on the books but to follow them willingly and positively, with a genuine and consistent regard for the privacy of all.