How Do You Know When You’re Being Manipulated? The Dangers of Dark Pattern Design

Consumer decisions are never made in a vacuum of logic and objective assessment. User experience (UX) design influences consumer choices more than we realize, and user interfaces (UI) can be hijacked to steer users into taking unintended actions.

These hijacked designs are known as dark patterns.

By using specific psychological techniques, designers can create UX and UI products that deceptively guide users into making decisions, often without their full understanding.

Tactics such as strategic visual design and persuasive micro-copy can manipulate people into making unintended purchases, consenting to invasive privacy settings or spending more time in an application than they intended.

And they’re not restricted to the fringes of e-commerce sites, either: A recent study found more than 1,800 instances of dark pattern use on 1,254 online shopping websites.

Big Tech’s Use of Dark Pattern Designs

Big Tech behemoths are no strangers to using dark pattern designs to further their interests.

Dark patterns have been used to conceal an “opt out” data-sharing option, burying the alternative beneath a lengthy list of Terms and Conditions. Other design tricks include highlighting an “Accept and Continue” button in blue while leaving the alternative in muted white, where a simple “Decline” button would have sufficed.
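
To make the visual asymmetry concrete, here is a minimal, hypothetical sketch in TypeScript of how such a consent dialog might be wired up. The function name, styling values and labels are invented for illustration and are not taken from any real product, but they show how the affirmative path is made prominent while the alternative is muted and rerouted.

```typescript
// Hypothetical sketch of the "misdirection" tactic described above.
// The consent button is styled to dominate, while the alternative is
// rendered as faint text and routed through an extra settings flow.
function buildConsentDialog(): HTMLDivElement {
  const dialog = document.createElement("div");

  const accept = document.createElement("button");
  accept.textContent = "Accept and Continue";
  // Prominent, high-contrast styling nudges the user toward consenting.
  accept.style.cssText =
    "background:#1a73e8;color:#fff;font-size:16px;padding:12px 24px;";
  accept.onclick = () => console.log("Full tracking consent recorded");

  const alternative = document.createElement("button");
  // A simple "Decline" would suffice; instead the option is low-contrast
  // and sends the user into a multi-step opt-out flow.
  alternative.textContent = "Manage options";
  alternative.style.cssText =
    "background:#fff;color:#bbb;font-size:12px;border:none;";
  alternative.onclick = () => console.log("User sent to buried opt-out settings");

  dialog.append(accept, alternative);
  return dialog;
}

document.body.appendChild(buildConsentDialog());
```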

The exploitation of people’s personal interactions and information on social media platforms has had consequences for populations’ perceived freedoms and liberties, as we saw with the Cambridge Analytica scandal.

We can see the use of dark design across other services, too. Homestay and lodging companies have been criticized for the way they display per-night pricing, often excluding costs such as cleaning fees, service fees and taxes until the final step of the booking process.
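
The gap between the advertised rate and the final bill is easy to see in a small, hypothetical calculation; the sketch below uses invented figures and fee names purely to illustrate this “drip pricing” effect.

```typescript
// Hypothetical illustration of drip pricing: the browsing view shows only
// the nightly rate, while mandatory fees appear at the final booking step.
interface Listing {
  nightlyRate: number;
  cleaningFee: number;
  serviceFeeRate: number; // e.g., 0.14 = 14% of the subtotal
  taxRate: number;        // e.g., 0.10 = 10% of the subtotal
}

// What the user sees while comparing listings.
function advertisedPrice(listing: Listing, nights: number): number {
  return listing.nightlyRate * nights;
}

// What the user is actually asked to pay at checkout.
function checkoutTotal(listing: Listing, nights: number): number {
  const subtotal = listing.nightlyRate * nights + listing.cleaningFee;
  return subtotal * (1 + listing.serviceFeeRate + listing.taxRate);
}

const stay: Listing = {
  nightlyRate: 90,
  cleaningFee: 120,
  serviceFeeRate: 0.14,
  taxRate: 0.1,
};
console.log(advertisedPrice(stay, 3)); // 270 — the "per night" framing
console.log(checkoutTotal(stay, 3));   // 483.6 — roughly 79% more once fees appear
```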

Other tactics include burying the option to cancel an account deep in a company site’s architecture, forcing you to go through pages outlining all the services you’ll be missing out on.
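
This kind of flow can be sketched, again hypothetically, as a sequence of retention screens that all have to be clicked through before an account is actually closed; the step names below are invented, but the structure mirrors the pattern just described.

```typescript
// Hypothetical sketch of a cancellation flow buried behind retention screens.
const cancellationSteps: string[] = [
  "Account settings > Membership > Manage > Cancel",
  "A page listing every service you will be missing out on",
  "A discount offer to stay",
  "A survey asking why you are leaving",
  "A final confirmation screen",
];

// The account is only closed if the user pushes through every screen.
function cancelAccount(confirmStep: (step: string) => boolean): boolean {
  return cancellationSteps.every((step) => confirmStep(step));
}

// Signing up, by contrast, typically takes a single action.
console.log(
  cancelAccount((step) => {
    console.log(`Shown: ${step}`);
    return true; // a user who gives up at any step stays subscribed
  })
);
```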

The Push for Regulation Is Growing

As these techniques become increasingly widespread, legal steps have been taken to curb the use of dark design practices among tech companies.

Last year, the DETOUR Act (Deceptive Experiences to Online Users Reduction Act) was put before Congress; it would restrict irresponsible design practices by large online platforms with more than 100 million active users.

If passed, the act would make it illegal for companies to “design, modify or manipulate a user interface with the purpose or substantial effect of obscuring, subverting or impairing user autonomy, decision-making or choice to obtain consent or user data.”

Actions like this, along with legislation such as the GDPR in Europe and the upcoming California Consumer Privacy Act, signal the dawn of a new information economy. While it’s difficult to predict how Big Tech will respond, some big players, such as Microsoft and Mozilla, have actually come out in favor of the bill.

Will Regulatory Action Be Enough?

At the moment, few companies are likely to move much beyond surface-level principles or manifestos. Such principles often lack deeper accountability and fail to push tech companies toward truly non-deceptive, ethical design that gives users a clear and fair picture.

And we can expect the drive for data to continue unhindered: Data is an invaluable resource for Big Tech that the vast majority of companies will not be willing to compromise on.

As we can see with the first big GDPR crackdown on Big Tech in Europe, which is being delayed by a procedural complaint submitted by WhatsApp’s lawyers, convincing tech companies to comply will not be simple, especially as they can easily absorb financial penalties.

Despite this, there is increasing recognition of the impact and importance of personal data.

Politicians and citizens alike are waking up to abuses: We recently saw Google get hit with a record $170 million fine for violating children’s privacy on YouTube, while Amazon, Apple and Google have all faced complaints about their alleged failure to comply with GDPR this year.

Dark patterns and data privacy issues go hand in hand, so it’s vital to understand how the legislation that seeks to regulate these two threats interlinks.

Risk of a Consumer Backlash

While regulation is necessary, it’s not just the job of governments to curb dark patterns. Tech companies ultimately need to embed a sense of responsibility to the user as deeply in their design as they embed the need for business success.

UX and UI design teams need to ask important questions: How does this design influence user behavior? How does extensive data collection infringe on the rights of individuals? Neglecting these issues will not only invite legal consequences: As consumers become increasingly aware of how they use technology and how tech companies use their data, organizations that leverage dark design and misuse user data put their reputations on the line and risk alienating consumers.

Collaboration on Ethics

Foundational change that ensures respect for users’ rights may even mean collaboration between designers and regulatory bodies to develop ways to move toward applied ethics. It’s vital for both users and these digital teams to recognize the duality taking place: On the one hand, we enjoy the opportunities and convenience given to us by digital apps. On the other, we must carefully assess the risks posed to individuals and societies as a whole and ensure that our civil liberties are protected from the long arm of data-hungry Big Tech.

Going forward, understanding this duality will be even more crucial in preventing future generations from interacting blindly with technology, while a stronger focus on responsibility in design will help them maintain control.

Only with collaboration between regulators and ethically acting digital teams will dark patterns lose their normative status and the tide of unethical practices be pushed back.

Johannes Dornisch

Lead UX Designer at intive

Johannes Dornisch is the lead UX designer at intive. Focused on finding the most effective solutions where people, technology and business intersect, Johannes has received several awards, including the Red Dot Design Award and the UX Design Award in the product category from the International Design Center Berlin.
