User Experience designers have a responsibility to help make tech accessible, equitable, and sustainable, yet our skills are continually enlisted in unethical practices for profit. As UX designers, we are responsible for the impact our actions have on the world. We can no longer hide behind Product Owners’ decisions, marketing briefings, or technical specs to offset our obligations towards the user’s well-being and the ethical implications of our actions.
For a User Experience (UX) professional, the main focus is often a single service moment or “touchpoint.” It focuses on the users of the service at that moment, typically a digital one, such as a web application. For a Human Centered Service Designer, the focus is much broader: the whole service, with its many service moments before, during, and after. These moments are interactions between the customer and the service provider and can be digital, physical, or interpersonal. They require taking a deep look at both the elements that are visible to the customer and the elements that happen behind the scenes, with the goal of creating a better user experience.
For those of us designers in leadership positions, we have come a long way to be at the table. We have fought every inch of the way to gain more authority, influence, and visibility in the decision-making that leads to the development of products and business value. With the benefits of this new position, we have also gained the responsibility of addressing unethical practices and their repercussions.
Capitalism: An economic and political system in which a country's trade and industry are controlled by private owners for profit, rather than by the state.
Under capitalism, the individual capitalist introduces technology and improves productivity in order to increase their own individual profit, not for the betterment of the user. Inequality is much more than a side-effect of free market capitalism. Capitalism has been incredibly successful at boosting wealth, but it has failed at redistributing it. Today, without a push to redistribute wealth and opportunity, our model of capitalism and democracy is leading us down a dark path.
Ethical Design needs to take a central place in discussions of the role technology plays in preserving (or eroding) human rights, radicalizing social dynamics, and redefining users’ locus of control over their own privacy. As a community, we need to be ready for, and actively promote, those difficult conversations.
Dark Pattern: This term, coined by Harry Brignull, is used to describe interfaces that are crafted so that the users are cheated, misdirected or blocked from achieving a reasonable goal.
Dark patterns create an inequity in the power balance of a system over time, leveraging one user of the product over another. Whether we are aware of it or not, we have all encountered dark patterns. They are user interface (UI) elements that have been carefully crafted to trick users into doing things they might not otherwise do, like adding insurance to an order or signing up for recurring billing.
Often dark patterns rely on the fact that we are not giving a site our full attention, slipping things past us. But at other times they use manipulative techniques based on psychology to pressure us into taking the desired action: techniques such as scarcity, social pressure, or fear of missing out. Etsy has adopted these techniques by highlighting the number of people who have added an item to their basket and the limited stock for that item. These user interface elements leave you feeling that if you don’t act quickly, you will miss out.
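To make this concrete, below is a minimal sketch, in Python with entirely hypothetical names and thresholds (this is not Etsy’s or any retailer’s actual code), of how a scarcity-style banner might be generated. Note that the honest and the fabricated variants sit only a few lines apart, which is part of why these patterns are so easy to ship:

```python
# Hypothetical sketch of a scarcity-style dark pattern.
# The copy is chosen to maximize urgency, not to inform.
import random

def scarcity_banner(stock: int, viewers: int) -> str:
    """Return urgency copy for a product page."""
    messages = []
    if stock <= 10:  # threshold tuned for pressure, not for accuracy
        messages.append(f"Only {stock} left in stock!")
    if viewers > 1:
        messages.append(f"{viewers} people have this in their basket right now!")
    # Darker variant: fabricate a viewer count when the real
    # numbers are too low to create any pressure.
    if not messages:
        messages.append(f"{random.randint(3, 12)} people are viewing this item!")
    return " ".join(messages)

print(scarcity_banner(stock=2, viewers=7))
# -> "Only 2 left in stock! 7 people have this in their basket right now!"
```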
Despite being widely condemned as unethical, dark patterns persist and are, if anything, growing in popularity. That is because they do, in fact, work. As marketers and other digital specialists find themselves under increasing pressure from senior management to meet targets, it is hardly surprising that they turn to dark patterns as the answer.
These manipulations exist in our physical spaces too. Ranging from slanted benches to metal spikes, “hostile architecture” occurs when elements of the built environment are specifically designed to curtail “undesirable” use. Usually, the groups targeted by hostile architecture are homeless people looking for somewhere to rest or teenagers looking for somewhere to play. Not only does this practice contradict the main tenets of public space (i.e., accessibility, freedom of usage, inclusivity), but it is also likely to lower the quality of the space in general. Whether you call it hostile architecture, defensive design, or exclusionary planning, it is used to alter human behavior and limit the ways in which an object can be used.
The powers that be, with their above-average pay grades, often do not understand tech well enough to see that they have crossed a line, or they simply don’t care because they are worried about ad revenue, key performance indicators, and deepening their pockets. We need to add liability to their list and hold them accountable.
On October 5, 2021, Facebook whistleblower Frances Haugen testified before Congress, telling a Senate subcommittee that the social media giant is putting “profits before people.”
“There’s no one currently holding Mark accountable,” she said, referring to Facebook CEO Mark Zuckerberg. In particular, she highlighted Facebook’s engagement-based ranking systems, which determine who sees what on the platform. “The algorithms give greater weight to content that elicits the strongest reactions, thereby boosting increasingly extreme posts.”
Haugen also warned that Facebook relies too much on artificial intelligence to screen out harmful content, the vast majority of which evades detection.
The problem with Facebook’s products is not that they host user-generated content. It’s that they use machine learning to show us the content that Facebook thinks we want to see in order to keep us on the platform longer and sell more ads. What Facebook sells is not an online message board where people can express themselves; it’s surveillance-driven algorithmic manipulation that’s maximized for engagement.
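To illustrate the dynamic Haugen describes, here is a minimal, hypothetical sketch of engagement-based ranking. The reaction names and weights are invented for illustration; this is not Facebook’s actual algorithm. Because stronger reactions score higher, the most inflammatory post surfaces first, and nothing in the scoring measures truth, harm, or well-being:

```python
# Hypothetical sketch of engagement-based ranking: posts that
# provoke the strongest reactions score highest and surface first.
from dataclasses import dataclass

# Assumed weights: reactions signalling stronger emotion count for
# more, so content that provokes outrage outranks measured content.
REACTION_WEIGHTS = {"like": 1.0, "love": 2.0, "angry": 5.0, "share": 4.0}

@dataclass
class Post:
    text: str
    reactions: dict  # reaction name -> count

def engagement_score(post: Post) -> float:
    return sum(REACTION_WEIGHTS.get(name, 0.0) * count
               for name, count in post.reactions.items())

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort purely by engagement; user well-being never enters the sort key.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured policy analysis", {"like": 300, "love": 20}),  # score 340
    Post("Outrage-bait conspiracy", {"angry": 80, "share": 60}),  # score 640
])
print([p.text for p in feed])  # the outrage-bait post ranks first
```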
With that in mind, let’s set aside the ethics for a moment and consider the business ramifications of dark patterns instead. Because, although dark patterns do often work, they come with long-term consequences, consequences that are not readily associated with dark patterns if you are only focusing on key performance indicators. For example, companies that use dark patterns may notice a decline in repeat business and brand loyalty over the long term. But that can be fixed with a rebrand, right?
Let’s say you have two competing companies in a very saturated market. “Company A” finds out that “Company B” has developed a new upselling strategy in its checkout that drives 20% more conversions. Company A imitates that pattern and also adds a dark pattern that leads users to believe that the item they are viewing is the last one in stock and that ten other people are interested in buying it. Sales motivated by fake scarcity increase another 20%; Company B notices, copies it, and adds another shady UX scheme to manipulate users. Then they sell this as user experience, pointing at the numbers to say, “Look, it works.” The cycle goes on until bad practices become standard, other companies copy them without even wondering whether it is the right thing to do, and soon every checkout experience is a living hell.
Think about all those political institutions, banks, airlines, retail stores, and insurance companies that have done rebranding after rebranding, trying to wash away their bad reputation, without noticing that their practices, behaviours and values are the ones that need to be “redesigned,” not their logo. Manipulation works… until it doesn’t. For every market plagued with abusive user experiences, an upcoming startup is gaining market traction, faster growth and more enthusiastic user engagement with a simple yet revolutionary idea: Treat the user nicely.
Consumers are still seen as just numbers, yet the foundation of UX is not numerical data. It is people.
It might look like success as a number, but someone may be paying the price. Data, nominally an invisible entity, is beginning to be felt by all. There have always been dystopian, Kafkaesque concerns about the reduction of humans to data points, but only now are we beginning to see these concerns truly and harmfully reflected in nearly every action we take. This isn’t a dystopia, or a totalitarian regime bent on societal control; it’s simply the order of the day for capitalism. Capitalism seeks capital, and capital can only be understood through the quantified: it can only be represented by numbers, not by quality. Flattening “things-in-the-world” such as quality, knowledge, concepts, or people into numbers is hugely advantageous for capitalism because it allows for easy processing in the pursuit of more capital.
Tech is founded on this: value is quantified, but so are all problems and solutions. The ability to measure, optimize, and strategize solutions is unparalleled. Tech claims that all of our social ills can be “solved” by a clever enough application of 1s and 0s. Folks in the User Experience field, across research, content, and design, are all advocates for the user; however, they are forced to play by the rules of the quantified, the bottom line, data. There is a lot going on right now with the use of data and its application to humanity. What’s not being discussed, however, is the manipulation of UX for business and ad-revenue needs, putting the dollar before the actual user. Examples:
Home-buying apps are leveraging their data to buy up homes, and homeowners are losing out. On top of this, 75% of Americans cannot afford a home anyway.
Social media’s algorithms are under fire for manipulating elections and polarising political discourse, and continue to encourage hate speech for profit.
An unregulated, data-driven gig economy is increasingly seen as inhumane and anti-labour.
We seem to have a goal of eliminating the user, by designing systems that allow data and AI to make personal decisions for our lives. I’m not saying that many of these tools, apps, and other technologies are not hugely convenient. But in a sense, they run counter to who we are as human beings.
User Experience is Human Centered
User Experience is founded on people being-in-the-world and noting their experience. It is user-centered and human-centered design. Yet it often feels as if UX acts as a market-intelligence interpreter, or is seen as a method of validating products for market-intelligence needs (often to match the findings of a competitive analysis). An experience is not a quantity; it’s a quality, and too often we forget to look at the impact and ripple effect of our products.
We can try to put metrics next to experiences like happiness or frustration, but you don’t feel a “3” on a scale of frustration, you feel what you feel. Given an opening, you might talk about qualities of your experience which may or may not include happiness or frustration but may involve other emotions, themes or observations.
How you construe and reflect on meaning from an experience is severely constricted by quantified, researcher-defined parameters. There seem to be good-faith efforts to address this everywhere I have worked, but the problems are glaring and deep-rooted. Again and again, when I do UX research and analyze themes or concepts, I’m asked for the “data,” ignoring the users’ quotes and insights that support my analysis.
Being immersed in a contextual inquiry, or conducting qualitative user testing, allows you to notice trends and themes by carefully noting the meaning behind people’s actions, words, and understandings. Analysis such as this doesn’t result in numbers; numbers may play a part, but the overall analysis looks to understand the depth, breadth, and relations of concepts. And these concepts might move between levels of granularity or rely on a number of variables (facial expressions, tone, body language, etc.). All of this means that there is no single number, and there shouldn’t be, in most forms of qualitative UX research.
Yet the quantified underpinning of capitalism forms our frame of reference, as the realm of the quantified defines what we can and cannot do. In other words, our creativity and its resultant output are restricted. We think in terms of optimising local areas of systems. We think of increasing conversion. We think we can solve social ills with enough 0s and 1s.
UX professionals are advocates for the user, not the business. Indeed, that’s where we are most effective. Yes, I am paid by the business to make the best possible experience for that product, but the best possible experience for a user and the best possible experience for a product are not the same thing. In this way, business needs and user needs often conflict, supporting one side of the system more than the other. The UX team provides a seat at the table for the user, and a voice for the ones not invited. That’s why we are here.
Capitalism demands money and it demands metrics to show how a user’s experience is improving their money-making. Money is a quantity which only understands other measurable quantities. And a quantity is only measurable when it becomes a variable that must be made (seemingly) objective and generalized by defining a set of parameters which determine an instance of that variable. Yet experience is personal, subjective and continuous.
This is why it can be difficult, or inappropriate, to think about a single product’s UX. A UX designer must erect artificial boundaries around the context of investigation: conversion becomes the ultimate arbiter of an experience, not the actual quality of an experience, and to understand conversion, we have to measure. Our ability to examine someone’s experience of the world degrades because, under the demands of quantification in relation to capitalism, we cannot engage with the full range of experience.
This is because it’s simply far more profitable to facilitate the finding of content than to help create frameworks that support personalized systems of information. Every UX professional will tell you of the importance of personalization, wayfinding, and sensemaking, all qualities that could be engendered far more effectively if we focused on personal, curated informational systems built around the relationships between humans and their physical environment. These concepts don’t exist in isolation amid the artificial boundaries of URLs. They cross channels, cross into our brains, and into our lives.
None of this is to say that an understanding of quantity is useless. In tech, quantification can tell us, in dead tones, how much of things: interactions, downloads, or hits. It can tell us about routes taken and objects clicked. It cannot help us with vital issues of experience that exceed the parameters of measurable quantities, such as:
How can we help you build your life in the way you want it to be built?
What are ways that we aren’t supporting you in doing something you need support with?
What meaning do you make out of your interactions and experiences with an activity you do?
What do you understand from your interactions with a particular area of your life?
What is the context of your experiences and interactions?
And simply, how can your life be better?
The answers to these questions can’t be bound to the variability of a single — or even multiple — measurable quantities, measured within the use of a single product. Indeed, qualitative answers to these questions may point to the fact that you shouldn’t use a product in question, or might even show that we should scrap certain digital products given how damaging they can be to our mental well-being.
For one, how we consume, how we prioritize incentive-based structures over all others, and how we build our economies all need to change. I don’t need to explain the millions of other reasons why this needs to change as well.
On an individual level, we use the world to help us remember, think and be creative. Browser tabs are memories embodied. Emails are externalized lists of activities we have to do. How we formulate intent and use our world helps to define us, which can only be explored qualitatively. We can’t think of software and the web as individualised elements with defined parameters, but rather part of systems that are us, that contribute to forming and creating further needs, emotions and states of being.
Global warming, political polarisation, fake news: these are all issues that require qualitative and systems-based thinking to understand how best to solve them. This is well-researched territory, involving fields such as philosophy, cognitive science, archaeology, human-computer interaction, and systems theory. Imagine if those in the UX field, and workers of all stripes, could work across digital and physical ecosystems to create qualitatively impactful experiences, rather than increasing the quantifiable measurement of a small part of a single one for monetary value.
Design’s social influence
If you are part of a society, your simple presence is an act of influence on others. Design, by definition, is an act of positive influence. We introduce affordances to suggest how something should be used; we optimize flows to reduce user friction and favour practices that guide the user across various actions. There is nothing inherently wrong with that.
You could argue that if a design does not intentionally influence the user, it has lost its reason to exist. When we talk about design principles like guidance, error prevention, and user empowerment, we are persuading users towards their already established goals. Stewardship of the users’ objectives, mindfulness of our business goals, and awareness of our actions need to be part of any design review and feedback session.
Manipulation is the dark side of social influence. You can be a good or a bad influence, but influence follows a well-structured code of conduct between two entities; manipulation blurs those lines.
We start with an often overlooked fact: users are not oblivious to this manipulation. Uber had to address a storm of negative publicity concerning the ways it manipulated drivers into working extended hours. Then there is Facebook (again), which the media has repeatedly accused of encouraging users to become addicted to their feeds, not to mention its experiments in manipulating people’s emotions. The sad thing is that that was 2014. Here we are in 2021, with Instagram’s internal research: “We make body image issues worse for one in three teen girls.” Of course, not every company that uses manipulative techniques on its sites gets called out for it on the scale of Uber or Facebook. However, even if the backlash is limited, there is still a price to manipulation, and that is buyer’s remorse.
There are boundaries designers need to set in place to ensure that the power we exercise over the user has checks and balances embedded in our systems, practices, and behaviours. We need practical ways to detect manipulative practices and assess unethical behaviours. It is easy to get lost in the semantics of what is moral, ethical, or legal.
Are human values absolute or relative? Is free will a thing? In practice, this dialogue can be used to introduce confusion, so we need to keep it simple. We are designers, not lawyers: always act in the best interest of your users and the common good. I’m not particularly in favour of any other socio-economic framework, but we have to be able to imagine alternatives. It has to start somewhere, and imagining an accessible and equitable world rather than a quantity-based one is a start.
It’s a place that UX folks know well and are predisposed to. When we begin to uncouple from the quantified, from capitalism, our horizons shift and our gaze follows, enabling us to see patterns, themes, and causal structures that were otherwise invisible. When we see qualities, we begin to see how things are connected, and how we form meaning in relation to other things.
Recognizing manipulation
It may seem like I am only describing the current use and abuse of social media, especially its effect on younger generations. Still, there are many examples to pick from outside the usual suspects. Fundamentally, manipulation happens whenever a product finds ways to use your community to coerce you into remaining subscribed or involved.
As users, one of the most dangerous things about manipulation is that it is tough to detect while it happens. We know that we have been victims of it when we feel confused, resentful, frustrated, or angry at the end of an interaction. On the other hand, as designers, if your briefing actively tries to distract, hide information, or manipulate data so that users are intentionally tricked into perceiving a distorted reality, you should raise a red flag right away. Service blueprints and user journeys are great tools for detecting the inflection points where the correct information needs to be displayed and for tracing back the origins of any deceit.
If the user does not comply with the product’s demands, the result makes the user feel invisible, unrecognized, or diminished. Repeat this multiple times, and the user’s sense of self will start deteriorating until they conclude that their input does not matter. Those are the basic mechanisms of voter suppression, and sadly we see similar practices in design patterns, feedback tools, and crowdsourced data visualization. Is the targeted demographic already physically exhausted, mentally distressed, or overexcited? User research should not be about finding “the right buttons” in the users’ mental models to exploit them for private gain. That is a red flag.
With the rise of neuromarketing, the increased availability of biometrics in devices, and the normalization of data mining users’ behaviours, the tools for manipulation have become more advanced and potentially more dangerous. It is hard to estimate the extent of the damage that Cambridge Analytica’s negative influence did to modern democracies.
The long way ahead for Design Ethics
Battling manipulation in UX requires rendering those bad practices ineffective: teaching users about dark patterns, raising awareness of their prevalence, refusing to comply with internal or external pressure to use them, and bringing real innovation driven by design thinking and user-driven research. As design leaders, team members, and individual contributors, we cannot turn our attention away from our designs’ potential for damage. This is our opportunity to elevate the conversation and start early, before bad practices become normalized or the reputation of UX, the profession you love and respect, becomes tarnished by the harm it continued to push.
We need to elevate design ethics to a more practical level and identify ways to make ethics not an afterthought, not something to be considered separately, but something so ingrained in our process that not doing it means not doing design at all. It’s easy to have moral principles; yet in the real world, with the constraints our daily lives impose upon us, it’s seldom easy to act according to those principles.
We might simply say it’s inconvenient at the moment. That there’s a lack of time or budget to consider all the ethical implications of our work. That there are many more pressing concerns that have priority right now. Mostly, we are simply unaware of the possible consequences of our work. The only way to overcome the “inconvenience” of acting ethically is to practice daily ethical design: ethics structurally integrated into our daily work, processes, and tools as designers. No longer will we have to rely on the exceptions among us, those extremely principled few who are brave enough to stand up against the system no matter what kind of pressure is put upon them, because the system will be on our side. By applying ethics daily and structurally in our design process, we’ll be able to identify and neutralize the potential for mistakes and misuse at a very early stage. We’ll increase the quality of our designs and our practices simply because we’ll think things through more thoroughly, in a more conscious and structured manner.
If you happen to work in an environment where, despite your best efforts to expose and change those behaviours, user manipulation is still rampant, you need to assess whether these practices are ego-congruent, meaning the company produces manipulative experiences because doing so fits its worldview, so there is no inner conflict or shame to leverage for change. If that is the case, the best alternative is to move on and find a company with a healthier culture. Otherwise it becomes a struggle with imposter syndrome: you know best practice, which is why you were hired, yet you are taught the in-house way and expected to deliver bad practice.
It will most likely be easy to walk away from those kinds of companies, since they tend to be manipulative towards their employees as well. Use the same questions, find the traces of bad behaviour, and develop an escape plan right away. There are a lot of companies selling the idea of “design thinking” and throwing around words like “innovative,” “creative,” and “human centered,” but in the end they are gaslighting you into practices that are ill. Live your values every day, and put up the good fight in your designs.