Human Downgrading
  • Reporter Kim San
  • Approved 2022.05.02 22:59

▲Under the hood of the social media platform is a behavior modification program / The Almanac

Approximately 4 billion users access information on the Internet through Google; Facebook has 2.89 billion active users, and YouTube has 2 billion. These tech giants wield a near-divine power to shape social relationships, sway societal norms and, most importantly, control what is believed to be true: the fabric of reality upon which democratic society functions. “Persuasive technology is a massively underestimated and powerful force shaping the world because it shapes where two billion people place their attention on a daily basis,” testified Tristan Harris, an American technology ethicist and co-founder of the Center for Humane Technology, before the Senate Commerce Committee in 2019. Between these Goliath-like companies and their helpless users lies a huge asymmetry of power that poses a potentially existential threat to both the individual and society. The implications of the resulting problems – disinformation, polarization, outrage-ification, and shrinking attention spans, among others – are so far-reaching and profound that the phenomenon is rightfully described as a “climate change of culture.”
To put it bluntly, the current model for social media is unsustainable – even unethical. When the only way to communicate with people on the internet is through a third party whose goal is to surveil users and manipulate their behavior to maximize ad revenue, there is a problem. Its cause is largely the companies’ ill-suited advertising business model, which generates profit by capturing users’ attention and exposing them to as many ads as possible. But there is only so much attention to be extracted, creating a “race to the bottom of the brainstem” that, in turn, leads to ever more addictive design. When YouTube, for example, adds a new feature – such as auto-play in five seconds – to capture more user attention, it is effectively eating into the attention share of its competitor Netflix. Soon, Netflix follows suit by adding the same auto-play feature to its own design. To describe this situation, Harris coined the phrase “fracking for attention,” as if the competition were a search for ever more effective methods and locations for extracting hidden underground oil reserves. At the end of the day, the competition boils down to persuasive design choices that reach ever deeper into the most primitive human emotions. The prevailing negativity on social media platforms can be ascribed to exploiting the human fight-or-flight response; the ‘pull to refresh’ gesture resembles a casino slot machine; ‘infinite scroll’ erases any sense of how long a user has spent on the feed; the “like” and “follow” buttons redefine how people perceive social validation. Furthermore, “the race went deeper into persuading our very identity: photo-sharing apps that include beautification filters work better at capturing attention than apps that do not, fueling Body Dysmorphic Disorder, social anxiety, and depression,” according to the Center for Humane Technology.
The American computer scientist and philosophy writer Jaron Lanier likens social media’s recommender systems to behavior-conditioning mechanisms. Content is recommended based on constant measurement of user engagement, a simple metric of how effectively a piece of content captures attention. Concretely, if a piece of content shows high engagement, similar content is recommended further down the scroll. The engagement metric is derived from how long a user lingers on a piece of content before scrolling past it, and from whether the user liked, commented on, or shared it. This is problematic, however, because such measurements capture only short-term engagement, in effect rewarding sensational and provocative content that elicits an immediate emotional response while discouraging wholesome and positive content. As a famous study by three MIT scholars showed, fake news, sensational by nature, “spreads six times faster than real news because it is free to evolve to confirm existing beliefs, unlike real news which is constrained by what is true.” The fact that content is recommended based on engagement rather than factual accuracy exacerbates disinformation and “debases the information environment that powers our democracy,” as Harris put it in his testimony to Congress.
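The engagement-driven ranking described above can be sketched in a few lines of code. This is a purely illustrative toy, not any platform's real algorithm: the field names, weights, and scoring formula are all assumptions chosen to make the article's point visible, namely that a ranker fed only short-term signals never consults accuracy at all.

```python
# Hypothetical sketch of engagement-based feed ranking as the article
# describes it. All names and weights are illustrative assumptions,
# not any real platform's code.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    dwell_seconds: float  # avg. time users linger before scrolling past
    likes: int
    comments: int
    shares: int
    fact_checked: bool    # note: never consulted by the ranker below

def engagement_score(p: Post) -> float:
    # Short-term signals only: dwell time plus weighted interactions.
    return p.dwell_seconds + 1.0 * p.likes + 2.0 * p.comments + 3.0 * p.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Rank purely by engagement; factual accuracy plays no role.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Calm, accurate report", 8.0, 10, 1, 0, fact_checked=True),
    Post("Outrage-bait rumor", 25.0, 40, 30, 20, fact_checked=False),
])
print([p.title for p in feed])  # the sensational post ranks first
```

Because the sensational post accumulates more dwell time, comments, and shares, it wins the ranking even though it is the unverified one, which is exactly the incentive structure the article criticizes.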

▲Tristan Harris on a talk about Human Downgrading / Axios