When Design Goes Dark

Business leaders are well aware of the nasty consequences of dark patterns. Regardless, many take their chances.

There is a certain evil lurking within user experience (UX) design. To understand it, let’s look over the shoulder of an expectant mother, Kelly Peters, as she tries to buy a crib online.

When she lands on the home screen, a pop-up greets her, asking her to sign in to continue shopping. Does Kelly know she can dismiss it and keep browsing? No, so she chooses “sign in”.

Soon she is comparing cribs. She loves model A, but model B stares back at her with the labels “most popular” and “limited stock”. Not realizing those claims may not be true, she chooses model B.

Before the final checkout, she is interrupted by another pop-up asking her to subscribe to the retailer’s newsletter to stay updated on her shipment and other announcements. Again, not realizing that her inbox will be spammed incessantly and that unsubscribing will be complicated, she clicks “subscribe”.

By the end of this digital shopping experience, Kelly has been duped by dark patterns, just like millions of others every day.

Now, these manipulative design tactics might look clever, but they are deceitful and unethical. Misleading or tricking users into doing the company’s bidding is not a legitimate design practice.

To generate more sales and improve lead conversion, many designers and businesses indulge in dark patterns. The term was coined by UK-based designer Harry Brignull in 2010, just as the eCommerce industry was beginning to gather pace.

Where do you draw the line? A UX wilfully crafted to deceive customers is fair game for a lawsuit.

Some seemingly harmless UX features, such as autoplay, have also been categorized as dark patterns. YouTube, for instance, automatically plays the next video, endlessly serving content until the application is closed. The feature can be switched off, but it is usually “on” by default. Some consider it an attention-driven customer experience strategy, but it can backfire: young children who start out watching cartoons can find themselves, ten minutes later, watching inappropriate content.
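
To make the “on by default” point concrete, here is a minimal sketch; the type and field names are hypothetical, not YouTube’s actual settings. The same feature reads very differently depending on whether it ships as opt-out or opt-in.

```typescript
// Purely illustrative: a hypothetical player settings shape, not a real API.
interface PlayerSettings {
  autoplayNext: boolean;
}

// Dark-pattern-leaning default: autoplay stays on unless the user digs into settings.
const optOutDefaults: PlayerSettings = { autoplayNext: true };

// A more respectful default treats autoplay as something the user switches on.
const optInDefaults: PlayerSettings = { autoplayNext: false };
```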

The tech giants are no less guilty

Dark patterns hit a peak in 2016, when users reported their use on several platforms including Skype, Uber, Amazon, Facebook, and LinkedIn. That year, Facebook’s newsfeed algorithm came under investigation as dark patterns began to emerge on it. The algorithm surfaced both genuine and fake news depending on users’ observed opinions. That dark personalization fed confirmation bias and further narrowed users’ already self-curated perspectives.

When Microsoft launched Windows 10, it introduced a pop-up window for the upgrade. The design seemed harmless and legitimate, but Microsoft suffered a huge backlash because the upgrade started automatically even when the user clicked the close (X) button to dismiss the prompt.

LinkedIn was also hit with a class-action lawsuit that ended in a $13 million settlement in 2015, the result of the horde of follow-up emails that flooded inboxes urging recipients to “expand our professional network.” Fortunately, the dark pattern was recognized and brought to attention. It is only one of many examples that continue to serve as a warning to organizations tempted to grow their business through dark UX patterns.

In the early years, companies merely dabbled in dark patterns, but over time the practice became so routine that some companies were built on a foundation of them.

A Dark Library To Shame Deceitful UX

In 2019, a study discovered 1,818 instances of dark patterns on eCommerce websites, representing 15 types, including sneaking, urgency, social proof, and obstruction. The study suggests that “shopping websites that were more popular, according to Alexa rankings, were more likely to feature dark patterns.”
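
As a rough illustration of how such patterns can be surfaced at scale (a simplified sketch for intuition only, not the study’s actual methodology), urgency and scarcity copy tends to follow recognizable phrasing that simple text matching can flag:

```typescript
// Simplified illustration of flagging scarcity/urgency copy in page text.
const URGENCY_PHRASES: RegExp[] = [
  /only \d+ left/i,
  /limited stock/i,
  /offer ends (soon|today|in \d+ (minutes|hours))/i,
  /\d+ people are (viewing|looking at) this/i,
];

function flagUrgencySignals(pageText: string): string[] {
  // Collect the phrases in the page text that match known urgency/scarcity wording.
  return URGENCY_PHRASES
    .map((pattern) => pageText.match(pattern)?.[0])
    .filter((match): match is string => match !== undefined);
}

// Example: flagUrgencySignals("Hurry! Only 1 left in stock") returns ["Only 1 left"]
```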

Meanwhile, the official dark patterns website created by Brignull himself includes a “Hall of Shame” section that catalogues reported dark pattern sightings in the form of tweets. Brignull identified roughly a dozen types of these patterns; here are our top three picks.

Triggering FOMO (Fear of missing out)

“Only one left”

Many eCommerce websites use this dark pattern to pressure shoppers into buying an item before it supposedly runs out of stock. Many users believe the claim and succumb to FOMO.
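
The mechanics are trivial, which is part of the problem. A hedged sketch (hypothetical function names, not any real retailer’s code) of the difference between honest and fabricated scarcity:

```typescript
// Honest: the label reflects real inventory and disappears when it no longer applies.
function stockLabel(unitsInStock: number): string | null {
  return unitsInStock > 0 && unitsInStock <= 3 ? `Only ${unitsInStock} left` : null;
}

// Dark pattern: the label is hard-coded and shown regardless of actual inventory.
function fakeStockLabel(): string {
  return "Only 1 left";
}
```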

Roach Motel

“Why unsubscribe? It’s a painstaking process. Choose the simple disable notifications option instead”

Subscribing or creating an account on a website is an easy, two-step process. But when the user wants to unsubscribe or delete the account, the navigation becomes so arduous that many frustrated users give up and keep using the service. Anyone who has tried deleting a Facebook account knows the feeling.
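
The pattern boils down to an asymmetry between the way in and the way out, sketched below with hypothetical flow names:

```typescript
// Hypothetical flow names illustrating the asymmetry the roach motel relies on.
const signupSteps: string[] = ["enter-email", "confirm"]; // two quick steps in

const cancellationSteps: string[] = [
  "log-in",
  "find-buried-settings-page",
  "answer-exit-survey",
  "dismiss-retention-offer",
  "confirm-by-email",
]; // many steps out, each one a chance for the user to give up
```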

Bait and Switch

When X is not X. X is Y.

In the bait-and-switch pattern, the interface presents one thing but delivers another. A control may look like exactly what the customer is after, but clicking it does something else altogether. For instance, a “know more” button sounds like it will show more details about the product, yet it redirects the user to an account-creation or subscription page.
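
In code, the bait and switch is nothing more than a label and a handler that disagree. A hypothetical sketch, with made-up routes rather than any real site’s:

```typescript
// The label promises product details; the handler goes somewhere else entirely.
function onKnowMoreClick(navigate: (path: string) => void): void {
  // What the "Know more" label implies: navigate("/product/details");
  // What actually happens:
  navigate("/create-account");
}

// The honest version keeps the label and the destination aligned.
function onProductDetailsClick(navigate: (path: string) => void): void {
  navigate("/product/details");
}
```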

Battling it out for better CX

Jennifer King, a privacy and data policy Fellow at the Stanford Institute for Human-Centered Artificial Intelligence, said, “The Federal Trade Commission already has broad powers to regulate deceptive business practices and could right now take action to outlaw some very specific marketing practices online – the worst of the worst, essentially. And various states’ attorney generals also go after companies where the site design or the interaction design is outright deceptive. But it’s unclear whether existing laws support FTC or state action around more subtle uses of dark patterns to coerce and manipulate users online.”

To end dark patterns, US lawmakers introduced the Deceptive Experiences To Online Users Reduction (DETOUR) Act in 2019, which targets practices that involve “obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.”

US lawmakers recently reintroduced the bill to prohibit the growing use of dark patterns. The Act would bar businesses with more than 100 million users from designing interfaces that impair consumer decision-making, autonomy, and privacy.

Because the concept is vague yet far-reaching, another bill in California bans organizations from using dark patterns to obtain consent to data collection; it goes into effect in 2023. Among the specific examples it details, the bill bans double negatives such as “don’t not access my private information”.
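
A small sketch, with hypothetical field names rather than the bill’s own text, shows why the double negative is singled out: the checkbox does the opposite of what a quick read suggests.

```typescript
// Checking "Don't not access..." actually grants access, which is the opposite of
// what a quick read of "don't ... access" implies.
const doubleNegativeConsent = {
  label: "Don't not access my private information",
  checkedGrantsAccess: true,
};

// A plain affirmative opt-in leaves no room for misreading.
const clearConsent = {
  label: "Allow access to my private information",
  checkedGrantsAccess: true,
};
```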

Strengthening the 2018 California Consumer Privacy Act (CCPA), California enacted rules in 2021 under which businesses using dark patterns get a 30-day window to set their UX straight or face civil penalties under unfair competition laws.

Meanwhile, a European Union committee recently proposed measures to protect consumers by banning tech giants such as Google and Meta from the use of dark patterns and limiting ad targeting to minors. These measures will be added to the Digital Services Act that regulates online content.

Well, these laws are a start. But because they focus mainly on privacy, there is still a long way to go.

The ideal design scenario for businesses

Business leaders and designers are well aware that dark patterns carry negative consequences: higher churn and diminished brand value. Instead of filling the sales funnel, a manipulative UX raises the odds of abandonment. Yet many take the risk and hope for the best. What looks like a gamble on probability is really a choice between short-term cleverness and the trustworthiness that builds customer loyalty.

Seamlessness, ease of use, clarity, and fun interactions define good UX design. Because research is a critical part of UX, experts strongly encourage companies to invest in studying customer behavior.

Apart from introducing firm design practice standards, it is vital for leaders to infuse empathy into design. Designers need to practice an ethical code of conduct with privacy, trust, and decency as its core elements. With enough research into customer behavior and what customers actually want, it is possible to create a transparent, ethical UX.

While unethical customer acquisition can help a business climb the ladder, only loyalty and high customer lifetime value add up to true success.

Weapons against dark patterns

Colin Gray, an assistant professor at Purdue University who studies dark patterns, believes that customers might actually be aware of the manipulation tricks but rarely know how to escape or respond to them. Several researchers are developing tools that can help customers put an end to dark patterns.

Konrad Kollnig, a University of Oxford student, is fighting dark patterns with a new tool called GreaseDroid that can remove deceptive designs from Android applications. It lets users apply coded instructions called “patches” to their apps via a web portal. While it is not commercially available yet due to legal issues and Android platform restrictions, Kollnig demonstrated its potential by using it to remove two dark patterns from the Twitter app.

“This proposed system, even as a rhetorical device, is useful to unpack what kinds of rights consumers should have, and how these rights might intersect or conflict with the rights of app developers or platforms,” said Gray.

Meanwhile, voicing a concern about the tool, Professor Jason Hong said, “A user is essentially installing arbitrary code onto their device, which can be highly risky. The patch can make the app do anything, and you don’t have the protection of the Google Play store anymore.” The GreaseDroid team agrees and is currently working on it.

The Digital Civil Society Lab at Stanford has launched a Dark Patterns Tip Line where users can report dark patterns that they come across online. The Lab also plans to collaborate with civil society organizations to push the identification process further.

After a year of collecting reports, the Stanford team will analyze the data and issue a detailed report to researchers and policymakers who can help eradicate dark patterns across the internet. Additionally, Stanford University plans to introduce an undergraduate course on dark patterns covering several areas, including human-computer interaction and cognitive theory.

A decade ago, the Electronic Frontier Foundation had had enough of Facebook’s manipulative user interface; the coercion was labelled “Privacy Zuckering”, and the name still fits today. Although the industry is marked by years of scandals and billion-dollar fines, dark patterns persist. With the upcoming laws and new technological tools, however, the darkness might finally fade.
