Privacy Case Analysis

Google as a company needs no introduction, being a titan of the information technology industry. A pioneer of web search, Google has long been at the bleeding edge of new technologies and internet features. One such groundbreaking technology, which debuted in the mid-2000s, is Google Street View, an online application that allows users to view countries, cities, and neighborhoods from the street. This was accomplished via 360-degree cameras mounted on vehicles that traveled up and down roadways. Google Street View drew ire globally for its accuracy and detail; for many, the application was a blatant violation of privacy. Vaidhyanathan, a critic of the feature, wrote in “The Googlization of Everything (And Why We Should Worry)” that Google Street View subjects an individual’s life to maximum exposure by default. He argues that Google publishes an individual’s abode and various private interactions to a global audience. Google counters that it blurs and obfuscates personal information on request; Vaidhyanathan contends that this isn’t enough and that individuals may be unaware of the exposure. Privacy and private spaces have been shrinking in recent years, and I agree with the sentiment that Street View further invades this ever-dwindling privacy. Drawing on works from ethics writers such as Floridi and Grimmelmann, this analysis will cover why Google Street View breaches the privacy of the individual and how that privacy can be restored. In this case analysis I will argue that utilitarianism shows us that Google should have provided an ‘opt-in’ methodology for Google Street View. As a giant in the information technology industry, Google must also be mindful of the information security of the individual.

            Luciano Floridi, an Italian ethics professor, warns of technologies encroaching upon one of our dearest possessions: privacy. Floridi describes how, as technology has increased in sophistication and speed, our ability to maintain nominal levels of privacy has declined. Continuing on this point, he elaborates that the commodification of information technology for the general public in the seventies and eighties has not helped the situation. One of the central concepts Floridi proposes is informational friction, along with its killer, information and communication technologies (ICTs). Informational friction, as Floridi describes it, comprises the forces and hurdles that hamper the collection and flow of information from point to point: distance, the ability to obfuscate, and other methods or techniques that keep information from becoming public and dispersed. ICTs are technological advances that ease the gathering, storage, and transfer of information; examples include the telegraph and television. The digital era has accelerated the efficacy of the newest ICT, the internet. With this new ICT, Floridi argues, many of the traditional hurdles that informational friction provides have been almost entirely removed.

            Google Street View (GSV), in my opinion, is the mother of all ICTs. Regarding the privacy of homes and property, informational friction is completely removed by GSV and its global reach. The friction, which usually took the form of having to locate the desired property and travel to it, has been replaced by the ability to quickly log in to Google, search for an address, and begin reconnaissance. Addresses are instantly available to anyone, anywhere; there is zero friction, and privacy, in a sense, has been thrown out the window. To rectify this, I believe in not only blurring license plates and faces but also blurring houses unless the owner has specifically given permission. This has been done previously for famous individuals, such as politicians and actors, after they were doxed by malicious actors. The ability to blur properties has thus been tried and documented, is practicable with advances in machine learning, and would protect far more people. Why grant privacy only to a special, wealthy few? Utilitarianism holds that the outcome benefiting the most people should be chosen, and by utilitarian thinking, blurring by default would be the best decision: more people’s privacy would be protected, while at worst a small minority would be inconvenienced. Default blurring would control and contain the ICT without affecting the function of the application, as addresses could still be entered and the map function would remain largely intact. It would operate almost as a traditional map, with houses staying blurred and only streets viewable, unless homeowners grant Google specific permission. There would be less incentive to peruse and peer at people’s houses at random, while the application could still provide directions to an address. This change would benefit the most people.

            Grimmelmann builds on this point about privacy. In the Widener Law Journal, he details how privacy should be treated the same as product safety: just as there are consequences for physical product failures, there should be similar protections and consequences for digital privacy failures. Safety features and protections, as he describes, should be mandatory and should meet the expectations of consumers. In the journal, he also suggests new legal frameworks to handle violations of established privacy laws, expectations, and restrictions. Traditionally, goods like vehicles and aircraft include must-have features that protect the user, such as airbags and seatbelts; if these protective measures fail or are missing from the final product, there are consequences through the legal system. Grimmelmann applies this principle to the digital realm. For example, when a direct message is sent, there should be safety features in place to ensure confidentiality and safe transmission, so that the message can be viewed only by the intended recipient, not by prying eyes. If these mechanisms fail, there should be legal frameworks and recourse to handle the violation. He notes that this approach would be popular: despite popular opinion, the majority of social media users do care about their privacy and have been shown to actively keep their posts contained to local, trusted audiences. Continuing this point, Grimmelmann explains that other countries and regions, such as Japan and Europe, not only care about privacy but have specific laws and regulations protecting online privacy and digital spaces, just as they do for public spaces. Accountability and liability, Grimmelmann argues, should be the default mindset of digital applications and services, not an optional feature, especially in the information era, with its increasing reliance upon cyberspace.

            Google Street View treats people’s privacy as a product. Individuals have very little agency in the application, and its safety features are obscure. As described earlier, other cultures take privacy seriously. Japan in particular took issue with Google Street View, as homeowners there consider the roadway outside their homes an extension of their property. This is the case in parts of Europe as well, but Grimmelmann specifically mentions how Japanese legal teams fought Google Street View and successfully imposed new, more privacy-focused requirements on the application. For many, the objection was not just the photography of their homes but of their daily lives. Though blurred, figures could be made out on the streets going about their days, cataloging and archiving the places individuals had traveled. At best, this led to embarrassing situations; at worst, inferences could be made about someone’s daily habits, associating them with unscrupulous behavior. Either way, private lives were intruded upon. When building the program, Americans at Google took into consideration neither the diverse understandings and definitions of privacy nor the privacy boundaries set by cultures globally. Google’s treatment of the situation stands directly against utilitarian thinking: instead of finding the best way to implement a privacy solution, it chose a loose, obscure blurring framework. In a sense, it chose to impose ultimate clarity. Google’s failure to ensure product safety caused irreparable harm to people’s personal lives; displaying their homes to a global audience harmed large swaths of people rather than helping them. An ‘opt-in’ feature would have gone more smoothly for Google, causing the least uproar and protecting the greatest number of people’s privacy at home.

            To conclude, I would like to clarify that I do not believe there was ill will or malice in the design of Google Street View. Technology in the past few decades has been evolving rapidly, with companies vying for the next technological breakthrough, and in these digital “arms races” there is usually a “build first, ask questions later” approach. As previously stated, utilitarianism as a principle and concept aims for the greatest amount of good for the greatest number of people. Using Floridi’s and Grimmelmann’s articles, this case analysis has hopefully advanced the idea that an opt-in feature would be extremely successful and would assuage the public’s fear of privacy breaches at their own homes. One could argue that Google Street View has benefits, and undoubtedly there are some. By utilitarian thinking, however, the global privacy infractions occurring now indicate that an opt-in feature would be not only more widely accepted but far more beneficial for the masses. For consumers, internet users, and unwitting individuals alike, their property and homes would be safeguarded against prying eyes. A giant in the tech industry, Google could implement an opt-in solution that balances efficiency and privacy.

References

Vaidhyanathan, S. (2012). The Googlization of us. In The Googlization of everything (and why we should worry) (pp. 98–107). University of California Press.

Floridi, L. (2016). Privacy. In The 4th revolution: How the infosphere is reshaping human reality (pp. 1–18). Oxford University Press.

Grimmelmann, J. (2018). Privacy as product safety. Widener Law Journal, 19, 793–827. https://doi.org/10.31228/osf.io/pkcvd