Date: 22 March 2023
Author: Michaels, Jordyn

    Nothing in life is truly free - everything can be considered a transaction. Websites and social media platforms that are accessible without a monetary fee offer information and communication tools that drive people to exchange information and to be entertained. However, even if they appear to be free, there is more likely than not a hidden cost that consumers are not explicitly made aware of - their data. Data is so valuable that it is worth it for companies to establish free platforms with content users desire in exchange for access to their data. Such companies have developed deceptive user interfaces to maximize consumer data collection while users are interacting with their products.

    Deceptive user interfaces, commonly known as Dark Patterns, have plagued society for decades due to a lack of education and general awareness on the topic. As consumers and regulators begin to understand the lack of control one has over interfaces and consumer data, a movement has been ushered in to reclaim interface autonomy. This note will delve into the history of the development of Dark Patterns and the current status of pending and enacted legislation. Finally, this note will explore the most feasible pathway for regulating Dark Patterns - the Federal Trade Commission (FTC). Moreover, this note will also discuss the need for increased consumer education in digital privacy literacy, as consumers currently lack the background and vocabulary necessary to examine these tactics and hold companies accountable. Restrictions on Dark Patterns need to occur sooner rather than later and can begin through established methods within the FTC. If not, companies will continue to refine these methods, through the integration of artificial intelligence and machine learning, into interfaces far too sophisticated to be clearly assessed.


    When computer programmers create interfaces for website platforms, they develop them according to the way they want the consumer to interact with the site's services. (1) Interfaces are website design structures that determine how a site is formatted. Such interfaces dictate how consumers access certain features that lead them to specific information or opportunities to act on. (2) Dark Patterns are a class of tactics used in interfaces with the purpose of coercing the user to unknowingly interact with a website in a way that benefits the company or developer that created it. (3)

    Although Dark Patterns have existed since the dawn of the internet, their ancestors have long been integrated into consumerism and advertising. Starting in the 1970s, behavioral economists set out to understand the irrational decisions and behaviors of people, specifically in the realm of purchasing goods. (4) During their research, they discovered a set of indicators, such as a lack of information about the true market value of a product, that offered explanations for these behaviors. Eventually, researchers began to test such indicators to try to push people to make decisions they may not have consciously intended to make, coining this concept "nudging." (5) Business professionals began to apply this research through "choice architectures" in an attempt to dictate and control what their consumers were doing. (6) These practices became commonplace in retail businesses throughout the late 20th century. (7)

    In 1996, such tactics entered the web when Hotmail began to apply these concepts to expand its user base through a practice known as "growth hacking." (8) At the end of every user email, Hotmail programmers automatically added the message "Get your free email with Hotmail." (9) This created a free and direct advertising opportunity for Hotmail, as users were unintentionally promoting the product every time they sent an email. This tactic was an enormous success and allowed Hotmail to attract many new users. (10) However, growth hacking can cross a line into illegal predatory behavior, as was the case in a 2014 class action lawsuit. (11) The professional networking site LinkedIn settled a class action lawsuit over tactics it deployed between 2011 and 2014, in which it accessed users' contacts through their connections on the platform and automatically sent those contacts invites to try features on the site without the users' permission. (12)

    The term Dark Pattern was first devised in 2010 by expert user experience (UX) designer Harry Brignull, who defined it as "a user interface that has been carefully crafted to trick users into doing things such as buying insurance with their purchase or signing up for recurring bills." (13) Dark Patterns gained widespread use over the past decade once website developers decided to apply them in tandem with the behavioral research that bore nudging. (14) Developers tested such deceptive interfaces in real time by presenting users with different versions of a website to see which ones users were most receptive to and which produced the results the developers wanted. (15) Once enough data was collected, developers could use artificial intelligence to further develop interfaces that would give them the most ideal results with consumers.
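The real-time testing of interface variants described above follows the shape of a standard A/B test. As a purely illustrative sketch (the variant split, conversion probabilities, and function names here are hypothetical, not drawn from any company described in this note), a developer might compare two layouts by the rate at which users take the desired action:

```python
import random

# Hypothetical A/B test: assign each visitor to one of two interface
# variants and record whether they performed the action the site wanted.
def assign_variant(user_id: int) -> str:
    # Deterministic 50/50 split based on user id (illustrative only).
    return "A" if user_id % 2 == 0 else "B"

def conversion_rate(outcomes: dict, variant: str) -> float:
    results = outcomes[variant]
    return sum(results) / len(results) if results else 0.0

# Simulated traffic: variant B (the more deceptive layout) is assumed
# to convert far more often than the plain layout A.
random.seed(0)
outcomes = {"A": [], "B": []}
for user_id in range(1000):
    variant = assign_variant(user_id)
    clicked = random.random() < (0.03 if variant == "A" else 0.20)
    outcomes[variant].append(clicked)

winner = max(outcomes, key=lambda v: conversion_rate(outcomes, v))
print(f"Variant {winner} wins with rate {conversion_rate(outcomes, winner):.3f}")
```

In practice, the "winning" variant is then shown to all users, which is how a deceptive layout that merely tested well becomes the permanent interface.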

    In addition to their use in advertisements and e-commerce transactions, Dark Patterns are also used to spread misinformation that undermines democracy. (16) Dark Patterns were used to spread misinformation throughout the 2016 election cycle. (17) On social media platforms, most notably Facebook, news articles were placed onto users' feeds according to the algorithm's prediction of which content each user would find most favorable (based on their data), causing the interface to show only articles that supported their preferences. (18) These articles were pushed regardless of whether they were based on verifiable facts. What made these articles so dangerously effective is that they were designed to appear as if they came from real news websites. (19) The misleading, or even outright inaccurate, headlines in the articles were designed to deceive users in order to drive engagement. (20) Further, Facebook deployed a deceptive interface in its presentation of news articles by displaying them all in the same way, regardless of whether they came from a credible source, making it difficult for users to verify the source before clicking the link to view the full text. (21)


    There are eleven main types of Dark Patterns currently in use: Bait and Switch; Disguised Ads; Forced Continuity; Friend Spam; Hidden Costs; Misdirection; Price Comparison Prevention; Privacy Zuckering; Roach Motel; Sneak Into Basket; and Trick Questions. (22) Generally, most of these tactics have on their face been deemed legal unless they rise to a certain level of fraud or coercion. (23) Nonetheless, all can at least be considered morally questionable.

    Bait and Switch is a configuration where users attempt to interact with a website to make one decision but end up being redirected and tricked into making a completely different decision. One example is a 2016 Windows 10 pop-up: when users tried to stop a software upgrade by clicking the red "x" in the corner of the window (which typically signals for the window to close), the click would instead start the upgrade, even though that is not what the user intended. (24)

    Next, Disguised Ads are ads integrated within a website that look like normal features of the website but are not. Often these appear as sidebar ads that ask consumers to download things that appear to be an extension of content on the actual website when really it is a separate download from an advertiser. This is meant to make users accidentally click on the advertisement, causing them to interact with it more often because it appears to be regular website content. (25)

    With the rise of online subscription services, Forced Continuity is a pattern that appears all too frequently. This is when users, who think they are consenting to use a service for a free trial, are required to input their credit card information to access it. When the free trial ends, there is no easy way to cancel, thus triggering a paid subscription the user never intended. Many popular subscription services such as Blue Apron and Ipsy have allegedly used this tactic. (26) Some businesses use this method to prevent consumers from taking advantage of free trials, arguing that it attracts only consumers who are serious about trying their product. (27) This tactic has been used in online campaign fundraising and political action websites as well. Up until shortly before the 2020 election, Donald Trump's presidential campaign was using forced continuity patterns to automatically sign donors up for recurring donations. (28)
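The forced-continuity flow described above can be modeled as a trial that silently converts to a paid plan unless the user clears every cancellation hurdle. This is a hypothetical sketch (the class, field names, and the number of hurdles are invented for illustration, not taken from any service named in this note):

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical model of a forced-continuity trial: the card is collected
# up front, and the trial converts to a paid plan unless the user cancels.
@dataclass
class TrialSubscription:
    start: date
    trial_days: int = 14
    card_on_file: bool = True              # required before the trial even begins
    cancelled: bool = False
    cancellation_steps_required: int = 5   # e.g. retention offers, a phone call
    cancellation_steps_done: int = 0

    def attempt_cancel(self) -> bool:
        # Cancellation only succeeds after every hurdle is cleared.
        self.cancellation_steps_done += 1
        if self.cancellation_steps_done >= self.cancellation_steps_required:
            self.cancelled = True
        return self.cancelled

    def is_charged(self, today: date) -> bool:
        # Billing starts automatically once the trial window closes.
        trial_over = today > self.start + timedelta(days=self.trial_days)
        return trial_over and self.card_on_file and not self.cancelled

sub = TrialSubscription(start=date(2023, 1, 1))
sub.attempt_cancel()                     # one attempt is not enough; hurdles remain
print(sub.is_charged(date(2023, 2, 1)))  # the user is billed despite trying to quit
```

The asymmetry is the point: signing up takes one step, while cancelling takes several, and the billing default favors the company at every stage.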

    Next, Friend Spam is where a platform asks permission to access the user's email or social media contacts, ostensibly to find friends who also use the product, but really uses this information to push spam about the product to those contacts. (29) This is the conduct that required LinkedIn to settle the class action brought by its users. (30)

    Hidden Costs is a pattern where the user is forced to go through multiple hurdles before seeing the final purchase price when buying an item online. (31) Throughout this process, hidden charges appear that were not included in the original item listing, such as service fees, shipping, etc. (32) A shocking example of this tactic is TurboTax's scheme of advertising a free basic tax filing service when, in reality, added costs would only appear after one entered their information and navigated through screenings. (33) Moreover, there was actually a truly free version of this service TurboTax developed through their contract with the...
