Technologies that mediate social interaction can put our privacy and our safety at risk. Harassment, intimate partner violence and surveillance, data insecurity, and revenge porn are just a few of the harms that bedevil technosocial spaces and their users, particularly users from marginalized communities. This Article seeks to identify the building blocks of safe social spaces, or environments in which individuals can share personal information at low risk of privacy threats. Relying on analogies to offline social spaces--Alcoholics Anonymous meetings, teams of coworkers, and attorney-client relationships--this Article argues that if a social space is defined as an environment characterized by disclosure, then a safe social space is one in which disclosure norms are counterbalanced by equally powerful norms of trust that are both endogenously designed in and backed exogenously by law. Case studies of online social networks and social robots show how both the design of and the law governing technosocial spaces today not only fail to support trust, but actively undermine user safety by eroding trust and limiting the law's regulatory power. The Article concludes with both design and law reform proposals to better build and protect trust and safe social spaces.
TABLE OF CONTENTS

INTRODUCTION
I. TRUST AND SOCIAL GOVERNANCE
   A. What is Trust?
   B. How Does Trust Develop?
II. SAFE SOCIAL SPACES
   A. Alcoholics and Narcotics Anonymous
   B. Teams of Coworkers
   C. Attorney-Client Relationships
III. THE PROBLEM OF TECHNOSOCIAL SPACES
   A. Disclosure and Other Risks
   B. Organic Trust in Technosocial Spaces
   C. Manipulative Designs that Entice Disclosure
   D. Legal and Regulatory Void
IV. PROPOSED CHANGES TO DESIGN AND LAW
   A. Designing for Trust and Safety
   B. Law to Support Trust and Safety
      1. Information Fiduciaries
      2. Empowering the FTC
      3. Privacy by Design
      4. Reform to Section 230
CONCLUSION

INTRODUCTION
Our social interactions are mediated by technology. We chat with friends, read the news, and buy things on technosocial (1) platforms run by the likes of Facebook, Google, and Amazon. Alongside all of the benefits that kind of technology offers, it can also put our privacy and safety at risk. Scholars and media commentators have documented the rampant invasions of privacy, (2) gender-based harassment, (3) racism, (4) cyberstalking, (5) nonconsensual pornography, (6) and intimate surveillance (7) on digital social platforms. Prominent women and members of other marginalized groups are leaving these spaces. (8) That is not only regrettable; it is dangerous for democracy. (9) Even as scholars start to pay more attention to platform content moderation policies that ostensibly try to create safe and welcoming environments online, (10) things are not much better on the ground. This raises the question at the heart of this Article: How can we make online social spaces safer?
Social spaces, as I am using the phrase, are multi-actor information-sharing environments. (11) They can be physical (chatting with a friend at a coffee shop or running into an acquaintance and her dog at the corner of First and Main), digital (texting with someone on an online social network), or telephonic; (12) they can be big (a party or a megachurch) or small (a one-on-one meeting); they can involve the exchange of words (during a conversation over dinner) or body language (at a dance party or across the room at a tiresome meeting). (13) Spaces become social when they are constructed by persons engaged in information exchange.
As such, social spaces require us to navigate our privacy. Granted, privacy and sharing are creatures of context, (14) and different social contexts function on different disclosures. (15) But all social spaces operate with disclosure norms; that is, we all must share something. (16) Because sharing information involves some risk--disclosure inherently makes one vulnerable to others--social spaces require risk minimization mechanisms if they are to survive. Otherwise, we could not continue to share secrets with our best friends, confide in loved ones, engage in commerce, or express ourselves freely. (17) We would lose our sexual privacy, (18) our opportunities for solitude, (19) and our freedom to develop and affirm our identities as we see fit. (20) Creating environments where these freedoms exist is, I argue, the role of trust, design, (21) and the law. (22) If social spaces are defined by information exchange, safe social spaces are environments of information exchange in which disclosure norms are counterbalanced by norms of trust backed endogenously by design and exogenously by law.
To show that the building blocks of safe social spaces are trust, design, and law, this Article offers analogies. (23) Part I explores trust and its effects on social behavior. Part II then shows how three paradigmatic safe social spaces--Alcoholics Anonymous (AA) meetings, corporate teams, and attorney-client relationships--are all endogenously designed to foster the kind of trust, confidentiality, and discretion needed to facilitate disclosure. (24) And because society benefits from disclosures in each of these contexts, (25) the law exogenously supports designed-in norms of trust to ensure those spaces are safe for sharing personal, secret, or stigmatizing information.
Technosocial spaces, however, lack both endogenous and exogenous structures that support trust. Far from it. As I discuss in Part III, these spaces are actually designed to manipulate us and lull us into false senses of familiarity and confidence, thereby enticing risky disclosure. And they do so in a legal and regulatory void that leaves users unprotected and vulnerable to invasions of privacy and online harassment.
But it doesn't have to be that way. Technosocial spaces can learn from safe social spaces offline and reorient design and law to foster trust. Robust approaches to privacy- and safety-by-design can protect users from the inside, (26) and stronger legal responses to manipulative design and online harassment can restore trust when something goes wrong. These proposals are outlined in Part IV.
There is no perfectly safe space, digital or otherwise. Even better design, comprehensive federal and state laws, and private ordering cannot account for all human mischief. But in a modern world in which sharing is, if not always mandatory, expected, law and design can make social spaces safer by supporting and protecting trust and repairing it when it breaks down.
I. TRUST AND SOCIAL GOVERNANCE
Although scholars bring different modalities to the study of trust, there is remarkable overlap in the way different fields conceptualize the concept. That literature has been discussed in depth elsewhere. (27) A chief takeaway from that scholarship is that trust is an essential element of online social governance. Joel Reidenberg (28) and Lawrence Lessig (29) predicted this when they argued that law, architecture, markets, and norms work together to regulate online conduct. Trust is one of those norms and, therefore, an important focal point for the study of technosocial spaces.
A. What is Trust?
Robert Putnam and Francis Fukuyama think about trust as epiphenomenal with social capital. For Putnam, social capital is a "feature of social organizations ... that facilitates coordination and cooperation for mutual benefit." (30) Fukuyama goes a step further, arguing that social capital consists of norms or values, "instantiated in an actual relationship among two or more people, that promote cooperation between them." (31) On a micro level, social capital constitutes the advantages and benefits that individuals realize owing to their connections with others, like coworkers learning from one another and cooperating to achieve a goal (32) or social groups from diverse backgrounds whose experiences are enhanced because of their diversity. (33) Social capital also develops on a more macro level, among individuals in larger communities and nations and even among nations and peoples. (34) In all cases, social capital refers to the good things that develop out of our connections to others.
Trust is one of those good things. Trust is a resource of social capital concerning the expectations that others will behave according to accepted norms. (35) It is the "favorable expectation regarding other people's actions and intentions," (36) or the belief that others will behave in a predictable manner. For example, if I ask a friend to hold my spare set of keys, I trust she will not break in and steal from me. When an individual speaks with relative strangers in a support group like AA, she trusts that they will not divulge her secrets. (37) Trust, therefore, includes a willingness to accept some risk and vulnerability toward others and steps in to grease the wheels of social activity in the absence of perfect knowledge: I cannot know for certain that my neighbor will not abuse her key privileges or that my fellow support group members will keep my confidences. As Niklas Luhmann has stated, trust begins where knowledge ends. (38) As such, trust allows me to interact with and rely on others.
Trust is essential online. Nearly two decades ago, Helen Nissenbaum presciently noted that trust is "key to the promise the online world holds for great and diverse benefits to humanity," including richer communities, engaged politics, and robust commerce, because "[p]eople shy away from territories they distrust." (39) That is just as true today. Corporate executives talk about ensuring a steady stream of customer data by gaining user trust and confidence. (40) Apple asks us if we "Trust this browser?" when we log in to iCloud on a new device. In 2013, Facebook conducted a study of its users to determine "how trustworthy" they think Facebook is overall. (41) The Federal Trade Commission (FTC) (42) and the California Attorney General's Office (43) couch their recommendations for transparency in...