2022-09-25 11:38:25
Published October 16, 2015 | By James Williams
Over the past couple of months, the practice of ad blocking has received
heightened ethical scrutiny. (1,2,3,4)
1 - https://marco.org/2015/08/11/ad-blocking-ethics
2 - https://www.bbc.co.uk/news/technology-25219922
3 - http://blog.practicalethics.ox.ac.uk/2015/10/whats-the-moral-difference-between-ad-blocking-and-piracy/
4 - https://digiday.com/media/kant-on-ad-blocking/
If you're unfamiliar with the term, ad blocking refers to software (usually
web browser plug-ins, but increasingly mobile apps) that stops most ads from
appearing when you use websites or apps that would otherwise show them.
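As a rough sketch of the mechanism (my illustration, not the author's): most
blockers match each outgoing request against a list of filter rules, such as
the community-maintained EasyList, and cancel the requests that match. The
TypeScript below uses a hypothetical blocklist and a simplified host-suffix
match in place of the much richer filter syntax real blockers implement.
```typescript
// Minimal, simplified sketch of an ad blocker's request-matching step.
// The hosts below are hypothetical placeholders, not a real blocklist;
// real blockers consume large, regularly updated filter lists (e.g.
// EasyList) with a far richer rule syntax than a host-suffix check.
const blockedHosts: string[] = [
  "ads.example.com",
  "tracker.example.net",
];

// True if the request's hostname is a blocked host or one of its subdomains.
function isBlocked(requestUrl: string): boolean {
  const host = new URL(requestUrl).hostname;
  return blockedHosts.some(
    (blocked) => host === blocked || host.endsWith("." + blocked)
  );
}

// A browser extension would run this check before each request goes out.
console.log(isBlocked("https://ads.example.com/banner.js")); // true
console.log(isBlocked("https://news.example.org/article"));  // false
```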
Arguments against ad blocking tend to focus on the potential economic harms.
Because advertising is the dominant business model on the internet, if
everyone used ad-blocking software then wouldn't it all collapse? If you don't
see (or, in some cases, click on) ads, aren't you getting the services you
currently think of as "free" actually for free? By using ad blocking, aren't
you violating an agreement you have with online service providers to let them
show you ads in exchange for their services? Isn't ad blocking, as the
industry magazine AdAge has called it, "robbery, plain and simple"?
https://adage.com/article/digitalnext/ad-blocking-unnecessary-internet-apocalypse/300470
In response, defenders of ad blocking tend to counter with arguments that ads
are often annoying, and that blocking them is a way to force advertising to
get better. Besides, they say, users who block ads wouldn't have bought the
advertisers' products anyway. Many users also object to having data about their
browsing and other behavioral habits tracked by advertising companies. Some
also choose to block ads in hopes of speeding up page load times or reducing
their overall data usage.
What I find remarkable is the way both sides of this debate seem to simply
assume the large-scale capture and exploitation of human attention to be
ethical and/or inevitable in the first place. This demonstrates how utterly we
have all failed to understand the role of attention in the digital age as well
as the implications of spending most of our lives in an environment designed to
compete for it.
In the 1970s, Herbert Simon pointed out that when information becomes
abundant, attention becomes the scarce resource. In the digital age, we're
living through the pendulum swing of that reversal, yet we consistently
overlook its implications.
Think about it: the attention you're deploying in order to read this sentence
right now (an attention for which, by the way, I am grateful) includes, among
other things, the saccades of your eyeballs, the information flows of your
executive control function, your daily stockpile of willpower, and the goals
you hope reading this blog post will help you achieve. These and other
processes you use to navigate your life are literally the object of
competition among most of the technologies you use every day. Billions of
dollars are being spent to figure out how to get you to look at one thing over
another; to buy one thing over another; to care about one thing over another.
This is the way we are now monetizing most of the information in the world.
The large-scale effort that has emerged to capture and exploit your attention
as efficiently as possible is often referred to as the "attention economy." In
the attention economy, winning means getting as many people as possible to
spend as much time and attention as possible with your product or service.
(Although, as it's often said, in the attention economy "the user is the
product.") Because there's so much competition for people's attention, this
inevitably means you have to appeal to the impulsive parts of people's brains
and exploit the catalog of irrational biases that psychologists and behavioral
economists have been diligently compiling over the last few decades. (In fact,
there's a burgeoning industry of authors and consultants helping designers
draw on the latest research in behavioral science to exploit these
vulnerabilities as effectively and as reliably as possible.)
We experience the externalities of the attention economy in little drips, so we
tend to describe them with words of mild bemusement like "annoying" or
"distracting." But this is a grave misreading of their nature. In the short
term, distractions can keep us from doing the things we want to do. In the
longer term, however, they can accumulate and keep us from living the lives we
want to live, or, even worse, undermine our capacities for reflection and
self-regulation, making it harder, in the words of Harry Frankfurt, to "want
what we want to want." Thus there are deep ethical implications lurking here
for freedom, wellbeing, and even the integrity of the self.
Design ethics in the digital age has almost totally focused on how
technologies manage our information (think privacy, surveillance, censorship,
etc.), largely because our conceptual tool sets emerged in environments where
information was the scarce and valuable thing. But far less analysis has
focused on the way our technologies manage our attention, and it's long past
time to forge new ethical tools for this brave new world.
It's important to note that the essential question here is not whether we as
users are being manipulated by design. That is precisely what design is. The
question is whether or not the design is on our side.
Think about the websites, apps, or communications platforms you use most. What
behavioral metric do you think they're trying to maximize in their design of
your attentional environment? I mean, what do you think is actually on the
dashboards in their weekly product design meetings?
Whatever metric you think they're nudging you toward: how do you know?
Wouldn't you like to know? Why shouldn't you know? Isn't there an entire realm
of
transparency and corporate responsibility going undemanded here?
I'll give you a hint, though: it's probably not any of the goals you have for
yourself. Your goals are things like "spend more time with the kids," "learn
to play the zither," "lose twenty pounds by summer," "finish my degree," etc.
Your
time is scarce, and you know it.
Your technologies, on the other hand, are trying to maximize goals like "Time
on Site," "Number of Video Views," "Number of Pageviews," and so on. Hence
clickbait, hence auto-playing videos, hence avalanches of notifications. Your
time is scarce, and your technologies know it.
But these design goals are petty and perverse. They don't recognize our
humanity because they don't bother to ask about it in the first place. In fact,
these goals often clash with the mission statements and marketing claims that
technology companies craft for themselves.
These petty and perverse goals exist largely because they serve the goals of
advertising. Most advertising incentivizes design that optimizes for our
attention rather than our intentions. (Where advertising does respect &
support user intent, it's arguable whether "advertising" is even the right
thing to call it.) And because digital interfaces are far more malleable (by
virtue of their basis in software) than traditional media such as TV and radio
ever were, digital environments can be bent more fully to the design logic of
advertising. Before software, advertising was always the exception to the
rule. But now, in the digital world, advertising has become the rule.
I often hear people say, "I use AdBlock, so the ads don't affect me at all."
How head-smackingly wrong they are. (I know, because I used to say this
myself.) If you use products and services whose fundamental design logic is
rooted in maximizing advertising performance (that is to say, in getting you
to spend as much of your precious time and attention using the product as
possible), then even if you don't see the ads, you still see the ad for the ad
(i.e. the product itself). You still get design that exploits your non-rational
psychological biases in ways that work against you. You still get the flypaper
even if you don t get the swatter. A product or service does not magically
redesign itself around your goals just because you block it from reaching its
own.
So if you wanted to cast a vote against the attention economy, how would you do
it?
There is no paid version of Facebook. Most websites don't give you the option
to pay them directly. Meaningful governmental regulation is unlikely. And the
attention economy can't fix itself: players in the ecosystem don't even
measure the things they'd need to measure in order to monetize our intentions
rather than our attention. Ultimately, the ethical challenge of the attention
economy is not one of individual actors but rather the system as a whole (a
perspective Luciano Floridi has termed "infraethics").
https://pubmed.ncbi.nlm.nih.gov/23197312/
In reality, ad blockers are one of the few tools that we as users have if we
want to push back against the perverse design logic that has cannibalized the
soul of the Web.
If enough of us used ad blockers, it could help force a systemic shift away
from the attention economy altogether, and the ultimate benefit to our lives
would not just be better ads. It would be better products: better
informational environments that are fundamentally designed to be on our side,
to respect our increasingly scarce attention, and to help us navigate under the
stars of our own goals and values. Isn't that what technology is for?
Given all this, the question should not be whether ad blocking is ethical, but
whether it is a moral obligation. The burden of proof falls squarely on
advertising to justify its intrusions into users' attentional spaces, not on
users to justify exercising their freedom of attention.