Using sharing on Facebook as a case study, this Article presents empirical evidence suggesting that trust is a significant factor in individuals' willingness to share personal information on online social networks. I then make two arguments, one that explains why Facebook is designed the way it is and one that calls for legal protection against unfair manipulation of users. I argue that Facebook is built on trust: the trust that exists between friends and the trust that exists between users and the platform. In particular, I describe how Facebook designs its platform and interface to leverage the trust we have in our friends to nudge us to share. Sometimes, that helps create a dynamic social environment: knowing what our friends are doing helps us determine when it is safe to interact. Other times, Facebook leverages trust to manipulate us into sharing information with advertisers. This should give us pause. Because Facebook uses trust-based design, users may be confused about the privacy effects of their behavior. Federal and state consumer and privacy protection regulators should step in.
INTRODUCTION
I. PRIVACY AND TRUST
   A. What Is Trust?
   B. Particular Social Trust and the Propensity to Disclose
   C. Trust, Sharing, and Privacy
II. RESEARCH DESIGN, METHODOLOGY, AND DATA
   A. Facebook
   B. The Survey
   C. Data Report
      1. Background Demographics
      2. Quantitative Data Analysis
III. SIGNIFICANCE OF FINDINGS
   A. How Facebook Exploits Trust
   B. Steps Forward
      1. Privacy by Design
      2. Federal Regulatory Responses
      3. State Attorneys General
CONCLUSION

INTRODUCTION
Online social networks present a privacy puzzle. Scholars have shown that between 2005 and 2011, both total sharing on Facebook and privacy-seeking behavior on the platform increased. (1) That means that Facebook users were sharing more personal information even as they changed their privacy settings to make their data ostensibly more secure. It seems counterintuitive: if we are concerned that Facebook does not protect our privacy, we should share less, not more. (2) This is a particularly jarring contradiction given that Facebook's voracious appetite for data is not sated by the information we actively disclose; it also sweeps in data from our clicks, third-party apps, internet browsing behavior, and our interactions with its partners and advertisers.
So how can we explain our sharing behavior? Several studies have suggested that people make their disclosure decisions based on a variety of factors, including whether others have disclosed, (3) the order of questions, (4) website design and aesthetics, (5) and social motivations, (6) to name just a few. James Grimmelmann has argued that because social networking sites are platforms for executing essential human social needs, it is Facebook's design that nudges us to disclose. (7) This Article builds on this work, arguing that Facebook encourages us to share personal information by designing its platform to cue trust among its members.
In 2013, Facebook asked its users: "How trustworthy is Facebook overall?" A spokesperson explained that Facebook was just looking for feedback to improve service and enhance user experiences. (8) But there is likely much more to it. We know that Facebook is an inherently social tool designed to create, foster, and expand social interaction. (9) We also know that Facebook routinely tinkers with its user interface to inspire user trust and, in turn, sharing. Its committee of Trust Engineers, for example, plays with wording, multiple choice options, the order of questions, designs, and other tools to encourage users to honestly report what they do not like about posts they want taken down. (10) That may be an important goal, but it shows that Facebook is well aware that trust and sharing are linked.
This Article begins where Facebook left off, seeking to fill a gap in the legal and social science literature on what motivates people to share personal information online and when regulators should step in to protect individuals from manipulation. Based on previous studies on Facebook's design and using primary empirical research of Facebook users, this Article shows that we share when we trust. In particular, it is the trust we have in others--what sociologists call particular social trust--that encourages us to share on Facebook. Higher levels of trust in the platform and higher levels of trust in those individuals in our networks are associated with a higher propensity to share personal information.
Facebook knows this, and it has designed its platform to benefit from it. (11) Among many other tactics, Facebook prefaces both social posts and native advertisements with information on how one's friends and other users have interacted with the content. In doing so, it not only creates the circumstances for social interaction with those we trust, it also exploits the trust we have in our friends and families for financial gain by manipulating us into sharing information with third-party advertisers. Given how frequently users already confuse native advertisements with other content, (12) Facebook's design strategy of leveraging trust to manipulate us into clicking on those advertisements should give us pause. Regulators should step in.
This Article proceeds in three parts. Part I briefly summarizes the two literatures relevant to this study: users' propensity to share personal information online and the sociology of trust. This Part argues that, to date, trust has played an underappreciated role in our understanding of sharing. Part II describes the methodology used for the research and presents the results, reporting a statistically significant correlation between trust and the willingness to disclose. The results suggest that individuals tend to share personal or sensitive information in contexts of trust, with the expectation that their privacy will be protected. Part III describes how this research is already reflected in Facebook's News Feed and suggests that privacy lawyers and regulators have a role to play in protecting consumers from manipulation.
I. PRIVACY AND TRUST
There is a growing literature on the connection between privacy and trust. (13) Several scholars, including James Grimmelmann, Alessandro Acquisti, and others, have conducted theoretical and empirical studies into our motivations and sharing behavior on Facebook. (14) This Article brings these otherwise isolated literatures together and provides quantitative evidence that trust is an important factor in Facebook users' decisions to share. The Article also shows that Facebook designs its platform to take advantage of this link.
This Part describes the current state of research on trust and sharing. I address what social scientists mean by trust, hypothesize its connection to individuals' propensity to disclose, briefly summarize the current social science literature on sharing, and show that trust in other people has been an underappreciated force in that research. I then bring together the social science and legal literatures to tease out the theoretical relationship between trust, sharing, and privacy.
A. What Is Trust?
Much of the work on trust, (15) sharing, and privacy online focuses either on how protecting privacy can build trust (16) or on how the perception that a website can be trusted to protect user privacy can assuage the privacy risks perceived by consumers. (17) Indeed, when the Federal Trade Commission (FTC) and the California Attorney General's office recommend that online platforms be transparent about their privacy and data practices so as to inspire consumer trust, (18) they are talking about the trust consumers have that those platforms will fulfill their data privacy promises and safeguard customer data. (19) But on what bases do we learn to trust these websites? This Article contends that it has a lot to do with who else uses them. That is particularly true for online social networks.
The trust we have in specific other people is called particular social trust: a resource of social capital among two or more persons grounded in the expectation that others will behave according to accepted norms. (20) It is the "favourable expectation regarding other people's actions and intentions," (21) or the belief that others will behave in a predictable manner. For example, if Alice asks Brady to hold her spare set of keys, she trusts that Brady will not break in and steal from her. When an individual speaks with relative strangers in a support group like Alcoholics Anonymous, she trusts that they will not divulge her secrets. Trust, therefore, includes a willingness to accept some risk and vulnerability toward others to grease the wheels of social activity. (22) If I never trusted, my social life would be paralyzed. As Niklas Luhmann stated, trust exists where knowledge ends. (23) It is the mutual "faithfulness" on which all social interaction depends. (24) I cannot know for certain that my neighbor will not abuse her key privileges or that my fellow support group members will keep my confidences, so trust allows me to interact with and rely on them. And I earn all sorts of positive rewards as a result. (25)
Lawyers should be familiar with particular social trust. It is, after all, at the core of the general notion of confidentiality and the more specific doctrines of privilege. (26) As Neil Richards and Woodrow Hartzog have noted, "perhaps the most basic assumption people make when disclosing personal information," whether to doctors, lovers, or ISPs, "is that the recipient will be discreet." (27) They note that we trust doctors "not to reveal information about our health and mental state" and trust lovers "not to kiss and tell." (28) Richards and Hartzog's formulation of discretion, therefore, is based on trust, or the expectation that individuals will continue to behave according to accepted social norms. We expect doctors to keep our medical...