Case Analysis on Privacy

by Cole Baty, for PHIL 355E, Fall 2022

INTRODUCTION

In “The Googlization of Us”, author Siva Vaidhyanathan describes the reactions of the public in several large markets to Google’s deployment of the Street View feature in its product Google Maps.  As the name implies, Street View allows users “to take a 360-degree view, at ground level, of streets and intersections in many cities” across the globe. Street View was rolled out in stages, appearing in different markets at different times, and Vaidhyanathan describes how the different cultures in these markets responded to the new ability “to undertake forms of surveillance of each other that have never been possible before.” There was a significant negative initial response to Street View in each market, and while these responses were all broadly concerned with privacy, the specific privacy concerns differed from market to market. In this Case Analysis, I will argue that the Confucian tool for moral reasoning shows us that Google should have studied each market more carefully before releasing Street View there, which would have revealed the need to accommodate each market’s specific privacy sensitivities.

FLORIDI: “Privacy: Informational Friction”, The Fourth Revolution

In “Privacy: Informational Friction” from The Fourth Revolution, author Luciano Floridi describes how our relationship to privacy has changed since the transition into the Information Age, or what he calls “the fourth revolution.”  Floridi proposes thinking of humans as informational organisms – inforgs – whose access to information can be described in terms of informational friction, where there is an inverse relationship between informational friction and access to information.  That is, more informational friction in a given environment means less access to information; less informational friction means more access to information.

Floridi argues that the privacy enjoyed by people in pre-fourth revolution times was largely tied to anonymity; that is, the informational friction in pre-fourth revolution times was high, and so access to information about other people was much more restricted.  By contrast, in post-fourth revolution times, informational friction is comparatively low: people freely volunteer personal information about themselves on any number of social media platforms, which is in turn available to other users of these platforms.  

This reduction in informational friction has shifted the perception of privacy between Generation X (the generation after the Baby Boomers) and Generation Y (the millennials).  Quoting a report from the Pew Internet & American Life Project on Teens, Privacy and Online Social Networks, Floridi notes that

To teens, all personal information is not created equal.  They say it is very important to understand the context of an information-sharing encounter.

It is easy to take for granted just how transformative the transition into the Information Age has been, and how rapidly society has changed.  We are at one of those unique moments in history where, for a short period of time, three generations coexist: people who grew up with only the Old Way, people who have only ever known the New Way, and people who grew up during the transition, who may remember the Old Way but have adopted the New Way.  In this case, the New Way is reliable access to fast Internet connections.  For example, in the Old Way, for most of human history, communicating in real time with someone on the opposite side of the planet was not possible, or else achieved at great cost.  In the New Way, I can communicate face-to-face in real time with anyone on the planet who has a reliable Internet signal, for fractions of a penny.  Information enjoys this same level of access – it is a trivial, inexpensive thing to transmit entire libraries’ worth of information across the Internet.

Floridi argues that the fourth revolution – the transition from the Old Way into the New Way – has had a profound transformative impact on the perception of privacy.  To put it in Floridi’s terms, before the fourth revolution, privacy was considered a more or less fixed level of informational friction for a given infosphere.  But, he notes, we already see a generation, born into the New Way, that employs a nuanced understanding of privacy, with variable levels of informational friction for a given infosphere.

This is, to me, a clear indication that Generation Y and the following generations are employing the Confucian tool for moral reasoning with regard to privacy.  Floridi is showing us here that there are already inforgs who do not see a one-size-fits-all solution to privacy, but rather that each choice made about privacy is heavily contextual and influenced by any number of overlapping factors, including the environment in which information is shared, the personal relationships between other inforgs in this environment, the risk of this information being exposed to a larger audience, etc.  This demonstrates an adaptation to the change in the infosphere brought about by the fourth revolution, and to me is a clear indication that people are “acting within their role” of a responsible inforg by making these nuanced choices about not only the type of information being shared, but also the medium (or media) in which the information is being presented.

If we consider Google as its own inforg, can we say that Google was acting within its role in the way it released Street View?  To answer this question, we should consider whether there was anything Google was required to do and did not.  As inforgs, human beings are bound by a certain set of rules and customs, usually dependent on the society in which they live.  Google, an inforg in its own right, is not necessarily bound by the same rules and customs which apply to human beings.  This is not to say, however, that there are no rules at all which govern the way Google can act as an inforg.

GRIMMELMANN: “Privacy as Product Safety”

In “Privacy as Product Safety,” James Grimmelmann also takes up the theme of privacy as a highly contextual, nuanced decision made by individuals.  He contrasts the way users innately apply their own privacy sensibilities to their use of social media platforms against what he claims are the inherently flawed privacy designs of these platforms, principally Facebook.  He introduces and then disproves four myths about Facebook users and privacy:

  1. Facebook users don’t care about privacy
  2. Facebook users make rational privacy choices
  3. Facebook users’ desire for privacy is unrealistic
  4. Database regulation will make Facebook privacy-safe

We shall consider only the first two myths for the purposes of this essay.

For the first claim, Grimmelmann points out that this myth “does not fit the available data,” citing “massive user protests” when Facebook released new features such as the News Feed, its Beacon advertising system, and its controversial data-retention policy – all features that relied heavily on aggregating specific information about users.  To understand why there would be misgivings about the scope of information displayed in the News Feed, consider that in the early days of Facebook, if you were interested in keeping up with the activities of users you had “friended,” you had to seek out this information by visiting each user’s individual profile page.  After the introduction of the News Feed, “when Facebook users find out that others are looking at their Facebook profiles, such as employers, relatives, or police, they…object” – protests from people for whom privacy clearly matters.  Grimmelmann points out behavioral adaptations effected in specific populations by these changes to the Facebook user interface, remarking particularly on the practice of college students untagging themselves in photos, posted from the previous evening’s partying, which might not show them in the most flattering light.  He writes,

The point is not that these “Digital Natives” prize privacy above all else or that they experience privacy in the same way previous generations did or that the social content of privacy is stable.  The privacy they care about is social and relational, perhaps less concerned with databases and governmental surveillance than their parents’ and grandparents’ privacy.  They are constantly trading their privacy off against other social opportunities and making pragmatic judgment calls about what to reveal and what to keep hidden.

Here again we see a younger generation adapted to the specific informational hazards of an environment with substantially less informational friction than previous generations experienced.  We see this generation acting in accordance with Confucianism by recognizing that no single solution applies to all situations; decisions about privacy require nuanced, highly contextual judgment.  We see individuals exercising agency over their own privacy sensibilities.

For the second myth, Grimmelmann demonstrates that “users massively misunderstand Facebook’s privacy architecture and settings,” citing as an example the fact that membership in a “group” exposes a user’s information by default to all other members of that group – and, by extension, to any secondary, tertiary, and further groups to which members of the primary group may also belong.  This runs counter to the intuitive assumption that anything shared in a so-called “private” group would be restricted to that original audience.  Facebook eventually eliminated this default group setting, “having presumably concluded that users were never going to understand how networks worked.”

Grimmelmann produces similar arguments to disprove the remaining myths, and ultimately asserts that privacy should be considered a commodity, and therefore subject to the same consumer protection regulations that apply to, for example, automobile manufacturers.  He argues that it is a question of liability: aligning privacy with commodity regulation would force business models that trade in privacy to put some “skin in the game” by assuming liability for damages done to people’s privacy.  Consider an automobile manufacturer that has released a vehicle later found to have a serious safety flaw.  The automaker is bound by consumer protection laws to issue a recall and rectify the flaw at no cost to the consumer.  “The first point implicit in the basic duty of sellers to make their products safe is that sellers can be held liable even when the consumer is at fault in the accident”, he writes.

Here, by proxy, Grimmelmann is demonstrating the well-established roles and duties expected of entities offering a service to the general public.  Generally, we expect the products we consume not to harm us.  This aligns with the Confucian principle that entities should behave in accordance with their roles.

CONCLUSION

Returning now to Google, we see that this inforg failed to act in “the right way” in the Confucian sense because it did not account for the specific cultural sensitivities to privacy in each of the markets in which it released Street View.  I believe Google should have researched how its users would react to such a pervasive intrusion into privacy.  Vaidhyanathan describes an open letter addressed to Google by search-engine professional Osamu Higuchi.  In this letter, Higuchi describes the difference between American and Japanese interpretations of the boundary between public and private spaces.  In the US, claims Higuchi, these privacy boundaries more or less align with property boundaries: private property ends at the property line, and public property (the street) begins on the other side of it.  In Japan, however, Higuchi explains that “[t]he residential street in front of a house…feels more like a part of one’s own living space, like part of the yard.”  Someone “peeping” over a fence or a hedge at the residents of the house is considered rude and antisocial.  The Street View images were obtained by cars with special cameras, mounted at or above average eye level, driven down every street in a given city.  Higuchi cited this “asymmetry of the gaze” as the underlying privacy concern.  As Vaidhyanathan puts it,

A person walking down the street peering into residents’ yards would be watched right back by offended residents, who would consider calling the police to report such dangerous and antisocial behavior.  But with Google Street View, the residents can’t see or know who is peeping.

After the concerns in Higuchi’s letter were taken up by many others in Japan, explains Vaidhyanathan, Google agreed to re-shoot the images with the cameras mounted lower on its cars.  I think this act of remediation is in line with Confucian moral reasoning – one party suffered an offense and notified the offending party, who made good-faith efforts to repair the relationship.  But if this could be done after the fact, it could just as easily have been done before the fact, by asking the first party whether they had any concerns about the implementation and deployment of Street View.