
CONSUMER PROTECTION AND DARK PATTERNS

INTRODUCTION

Dark patterns is an umbrella term for the deceptive methods and strategies that numerous websites and apps use to lure unsuspecting people into their services and then make it difficult for them to opt out. Users frequently cannot complain about such methods because there are few rules or laws in place, online or offline, to prevent website and app companies from engaging in such deception.[1] A dark pattern is a user interface designed to override the user’s own judgement and cognition, either to get people to do things they would not normally do or to make opting out so complicated that the user eventually gives up and continues with the service in question. These techniques encourage users to make decisions that run counter to their best interests, such as purchasing products or services that are of no use to them. This is often achieved by obscuring information that matters for decision-making, such as the true cost of a product, by adding irrelevant alternatives to distract consumers, or by guilting people into choices that do not serve them. Dark patterns come in many forms, and we as users encounter them daily. Common examples include conveniently failing to mention that a “free trial” will automatically convert into a paid subscription, with money deducted from the user’s account unless they specifically opt out of the process (which, again, is easier said than done); adding a “free membership” to a shopping cart; pre-selecting a default option; and framing choices as “continue” or “accept the risk”, thereby instilling fear in the minds of consumers.[2]

Dark patterns are also employed to keep customers subscribed by making it more difficult for them to cancel or opt out of existing subscriptions. The ‘Roach Motel’ pattern keeps people subscribed to services by forcing them to walk through multiple pages before they can opt out. It resembles a maze where getting in is simple but getting out is considerably more difficult. The ‘Roach Motel’ is typically employed when designing interfaces for cancelling subscriptions, deactivating accounts, or unsubscribing from mailing lists. A less obvious example of this pattern is deactivating or deleting Facebook and Instagram accounts. A user cannot simply deactivate their Instagram account from the Instagram app but has to log in through a browser to do so, and on Facebook the user finds themselves trapped in a myriad of steps and questions before being able to delete their account.[3] These complex, or rather unconventional, exit steps often frustrate users, who postpone the task for a while, ultimately never deactivate their accounts, and start using these applications again.

Dark patterns are becoming more popular and widespread as artificial intelligence techniques designed to maximise the likelihood that a user will take an action requested by the advertiser become more widely available. With gradually increasing awareness of dark patterns, there is a general consensus that the practice should be stopped, whether through voluntary choices by businesses or through the enforcement of rules and regulations. Dark patterns are associated with a loss of agency and choice and carry a distinctly negative connotation. If you use the internet, you have probably come across a dark pattern. Perhaps you could not find the unsubscribe button for a newsletter you do not recall signing up for, or you unintentionally clicked accept when you meant to decline because the former was highlighted. You may have consented to a corporation selling your personal information because the text informing you of this was buried in pages of legalese. A single dark pattern may cause only minor harm, but in combination they amount to a loss of control over users’ choices and actions.

BRIEF BACKGROUND OF DARK PATTERNS

Even though the practice behind dark patterns has been well known for decades, the term only became widespread in the early 2010s, owing to the portal darkpatterns.org, which began documenting dark patterns found on major websites. User experience (UX) specialists have discussed dark patterns for quite some time, and internal forums and groups of UI and UX professionals have long been actively debating the topic.

LAWS

CALIFORNIA – RECENT BAN ON DARK PATTERNS

In March 2021, California banned “dark patterns”. Their use is now prohibited under new regulations issued under the state’s California Consumer Privacy Act (CCPA). The new rules were announced on March 15 by California Attorney General Xavier Becerra, who stated that they update certain existing provisions and will strengthen the CCPA regulations approved in August 2020. According to the California government, the new rules will give website and app consumers more power and help them avoid online corporations’ cheap traps. At least in California, such businesses can no longer use “confusing language or superfluous actions” that push users to click unneeded links, wade through unwanted screens, or sit through lectures on why they should not leave the service.[4]

PRIVACY LAWS

Most current privacy regulations are based on the concept of explicit consent, in which a user is made aware of the means and methods by which their personal information is collected and subsequently consents to its processing. Under India’s current data protection rules, users must consent to the processing of sensitive categories of their personal data. India’s proposed data privacy law requires consent for the processing of personal data to be explicit and specific, and users must be given sufficient information before consent is obtained. However, these requirements are notoriously lenient, and for most Indians this has resulted in a UI experience riddled with dark patterns. Long privacy notices, pre-checked opt-in checkboxes, obscured opt-out alternatives, and crucial information buried in pages of legalese will no longer be an option under the new data protection framework. Instead, firms will be required to build their systems with the user’s privacy and choice in mind. California’s development is an important illustration of a state regulating product design that might not otherwise put the consumer first. Under the new rules, consumers who wish to opt out of the sale of their personal information can no longer be forced to navigate through multiple screens, confront double negatives, or complete unnecessary steps.

CONCLUSION

Privacy legislation is not the only way to combat dark patterns. India’s Consumer Protection Act, 2019 forbids unfair trade practices and misleading advertisements and could be a useful tool for regulating deceptive design choices. From a legal standpoint in India, there is a rising need for lawmakers, regulators, and legal practitioners to understand these patterns and collaborate with UI designers, businesses, and consumers to develop an ethical framework that enforces privacy and consumer protection principles.

Author(s) Name: Khushi Saxena (Symbiosis International University, Pune)

References:

[1] Luguri, J., & Strahilevitz, L. J. (2021). Shining a light on dark patterns. Journal of Legal Analysis, 13(1), 43-109.

[2] Mathur, A., Acar, G., Friedman, M. J., Lucherini, E., Mayer, J., Chetty, M., & Narayanan, A. (2019). Dark patterns at scale: Findings from a crawl of 11K shopping websites. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1-32.

[3] Waldman, A. E. (2020). Cognitive biases, dark patterns, and the ‘privacy paradox’. Current Opinion in Psychology, 31, 105-109.

[4] Gray, C. M., Santos, C., Bielova, N., Toth, M., & Clifford, D. (2021, May). Dark patterns and the legal requirements of consent banners: an interaction criticism perspective. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-18).
