“Bugs in Our Pockets: The Risks of Client-Side Scanning” by Harold Abelson et al. speaks to the importance of keeping user information encrypted and out of the wrong hands. The article focuses on how client-side scanning (CSS) – reviewing content on users’ devices to detect potentially harmful or illegal material before it is encrypted or after it is decrypted – can affect user privacy and encryption. The article poses two research questions: 1) whether CSS is a good solution for balancing the needs of law enforcement against an individual’s privacy, and 2) whether individual privacy and security can work together when encryption is used to keep our messages and information private.
This work covers several social science principles, such as privacy rights, ethical considerations, and technology-society interaction. Privacy rights refer to an individual’s right to keep certain information private and to choose what to share with others and how it can be used. CSS systems may conflict with privacy rights because they involve scanning and analyzing users’ content without explicit consent. Ethical considerations emphasize doing what is right and fair when making decisions, especially when creating new technologies. This matters for CSS because designers must decide whether the systems might treat people unfairly, censor them, or invade their privacy. Technology-society interaction considers how technology and society affect one another: for example, how new inventions change the way individuals live and behave, or how society influences the development of technology. CSS systems affect how individuals interact online, what they expect in terms of privacy, and who holds power over that privacy. It is therefore important that CSS systems incorporate these social science principles, especially where users’ information and privacy are concerned.
Perceptual hashing and machine learning are the two research methods used in this study. Perceptual hashing is “a technique that uses digital image processing to create a unique fingerprint or codes for an image. The fingerprint then is generated by a special algorithm that analyzes the characteristics of the given image such as the color, texture, and shape” (Abelson, H). Perceptual hashing is crucial for CSS because it allows systems to identify known harmful content without, in principle, compromising user privacy. Machine learning, the other method used in this study, focuses on developing algorithms and techniques that enable computers to learn from data and make predictions without being explicitly programmed for every task. These algorithms can help identify harmful content based on patterns, and they are especially promising for identifying text and video.
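To make the perceptual-hashing idea concrete, the following is a minimal sketch of one simple variant, the “average hash,” assuming the image has already been decoded and resized to an 8x8 grid of grayscale values. This toy example is not the algorithm deployed in any real CSS system (production systems such as PhotoDNA use far more robust techniques); it only illustrates how a perceptual fingerprint stays stable under small changes to the image, unlike a cryptographic hash.

```python
def average_hash(pixels):
    """Compute a 64-bit fingerprint from an 8x8 grayscale grid:
    each bit is 1 if that pixel is brighter than the mean intensity."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance means the images
    are perceptually similar."""
    return bin(h1 ^ h2).count("1")

# Two nearly identical 8x8 "images": a slight perturbation to one
# pixel leaves the fingerprint essentially unchanged.
img_a = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
img_b = [row[:] for row in img_a]
img_b[0][0] += 3  # tiny change a cryptographic hash would amplify

ha, hb = average_hash(img_a), average_hash(img_b)
print(hamming_distance(ha, hb))  # distance 0: perceptually identical
```

A scanner built on this idea would compare a hash against a database of fingerprints of known illegal images and flag matches below some distance threshold; the article’s concern is what happens to privacy when that comparison runs on the user’s own device.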
The work performed a technical and policy analysis of CSS, focusing on content-examination methods and their implications for privacy and security. The technical analysis covers how scanning works, the flow of scanned content between servers and clients, and the security and privacy risks associated with implementation. The policy analysis covers the impact on users’ privacy, how governments could abuse or misuse CSS systems, and how effective CSS is at detecting targeted content while still maintaining users’ trust and security.
A topic discussed in class that relates to this article is human factors. Considering human factors lets researchers learn how willing individuals are to trust CSS systems with their information. Users will want to know whether the technology protects their information rather than abusing or misusing it. Human factors add a new perspective to CSS systems, helping designers understand what individuals want, not just what organizations want.
The individuals who will have the most issues with CSS systems are marginalized groups who already face privacy issues and discrimination, such as racial minorities and LGBTQ+ individuals. Unjust surveillance could wrongly accuse these individuals of doing something illegal based on their identity, leading to more unfair treatment and making it harder for them to feel safe online. Censorship is another issue: what these individuals share about who they are or what they find important could be blocked or deleted because the system wrongly assumed it was harmful. For people who already face discrimination, censorship and exclusion could make them feel as though they cannot speak freely or be themselves.
In conclusion, the study of CSS systems contributes to society in several ways. One is by prompting policymakers and regulators to look closely at how CSS systems work and what effects they may have on privacy, free speech, and fair treatment. Another is by highlighting the technical issues CSS raises in the security field: finding weaknesses in CSS systems will push experts to develop better, safer ways of protecting users’ information online, which can help keep the safety and privacy of marginalized people protected from data breaches. Still, more work is needed for CSS to operate properly and ethically for all. Improving CSS will help create a welcoming environment not only for individuals who want the safety of encryption, but also for individuals who need encryption to keep them safe from harm.
Works Cited
Abelson, Harold, et al. “Bugs in Our Pockets: The Risks of Client-Side Scanning.” Journal of Cybersecurity, vol. 10, no. 1, Oxford University Press, 27 Jan. 2024, academic.oup.com/cybersecurity/article/10/1/tyad020/7590463.