The Attention Economy

In today's hyperconnected world, where every status update and every tweet is a way for us to express our opinions, social platforms have become the digital equivalent of town squares, connecting us with family, friends and like-minded people.

But what happens when the very platforms we count on turn against us, when bad actors exploit them to manipulate news, set agendas, and propagate toxic viewpoints? What happens when select groups hijack users' attention to spew conspiracy theories, Islamophobia, misogyny, and white-supremacist ideology with the intent of radicalising impressionable minds?

Attention feeds addiction (Image: Mozilla)

The public feeds, which incentivised us to share more and to connect and engage with as many people as possible, have also, of late, emerged as the biggest source of headaches for the companies running them. Whether it's Facebook, Instagram, Twitter, or YouTube, problems surrounding extremist content, moderation and general misuse run rampant.

While one may argue that these problems are too complex to admit any tangible solution, they have also, unfortunately, worked to the platforms' advantage, with the companies using these addictive services as a conduit to boost traffic and drive engagement.

By design, business model and algorithm, these largely automated platforms are easy to weaponise for spreading misinformation and fraudulent content. Sadly, this business model has also proven lucrative. After all, from a business standpoint, it's hard to take down a Facebook page or a YouTube channel that's driving a lot of traffic.

Their algorithms learn what information attracts attention, then promote that content to exactly the users who are most likely to consume and share it, which gives that content more attention and generates more profit. As a society, we communicate with each other using products that aren't designed to help us communicate; they're designed to keep us watching.
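The self-reinforcing loop described above can be sketched as a toy model. Everything here is hypothetical and purely illustrative: three made-up posts, each with an assumed per-impression engagement rate, and a feed that allocates exposure in proportion to past engagement rather than quality.

```python
# Toy model of engagement-driven amplification (illustrative only).
# "appeal" is a hypothetical per-impression engagement rate; the feed
# allocates exposure in proportion to past engagement, so whatever
# grabs attention early is shown more and gathers attention faster.
appeal = {"measured-report": 0.02, "outrage-bait": 0.06, "cat-photo": 0.03}
engagement = {name: 1.0 for name in appeal}

def step(impressions=1000):
    """One round of the feed: exposure follows past engagement."""
    total = sum(engagement.values())
    for name in engagement:
        exposure = impressions * engagement[name] / total
        engagement[name] += exposure * appeal[name]

for _ in range(100):
    step()

# Rank posts by accumulated engagement, the feed's only signal.
ranking = sorted(engagement, key=engagement.get, reverse=True)
```

Run for a hundred rounds, the post with the highest engagement rate ends up capturing most of the feed's attention; at no point did the ranking consult veracity or substance.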

The flaws in this algorithmic amplification system are baked into its design. They have been repeatedly exploited, and we're still vulnerable. We're vulnerable to content that grabs our attention, regardless of its veracity or substance. We're vulnerable to targeting systems that understand our preferences, our beliefs, and our politics, and can show attention-grabbing, influential content to exactly the people most likely to be manipulated by it.

And we're vulnerable to automation: fake accounts that invent, share, and amplify information to fool tech companies' algorithms into believing it will be valuable in capturing their users' attention. The gamification of the web has turned the attention economy into an addiction economy, one that rewards the apps and services we give our attention to. The way forward, then, is to responsibly build algorithmic yet rewarding experiences that don't abuse our attention.
