Online Media Dystopia
Concerned about online misinformation and fake news, I made a few revisions to the syllabi for my Introduction to Sociology courses before the start of the semester this past fall. I created an information literacy assignment based on the ongoing debate about the “marshmallow test,” but I also made space to discuss Zeynep Tufekci’s research, particularly her analyses of how digital platforms and their algorithms shape how we collect information, share ideas, and interact with each other. Many students responded enthusiastically to these topics. And while most were not surprised by the concerns Tufekci raises about digital platforms, many reported that understanding her research led them to reconsider the ways in which they engage online.
Zeynep Tufekci is an Associate Professor at the University of North Carolina in the School of Information and Library Science, with an affiliate position in UNC’s Department of Sociology. Her book, Twitter and Tear Gas, provides a vivid analysis of the ways in which social media supported social movements including the Arab Spring and the Occupy Movement, while also describing the challenges created by these same platforms.
In the New York Times Opinion pages, Tufekci describes how YouTube’s algorithms present viewers with increasingly extreme content, and warns that the platform has become “the Great Radicalizer.” Whether the viewer is watching political speeches or seemingly less controversial topics, the system serves up more intense content as part of the company’s effort to hold viewers’ attention and keep users engaged with the platform.
In her TED Talk, Tufekci describes watching a video about vegetarianism and being presented with a video about veganism, and quips, “It's like you're never hardcore enough for YouTube.” Tufekci also participated in an informative discussion on the Ezra Klein Show podcast about why online politics become extreme so quickly, and other issues, including online platforms’ imperative to grab and hold viewer attention.
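The dynamic Tufekci describes can be sketched in a few lines of code. This is a toy illustration, not YouTube’s actual system: all of the names, numbers, and the watch-time model here are made up. It simply shows that if a recommender always picks whichever video it predicts will be watched longest, and viewers tend to watch a bit longer when a video is slightly more intense than the last one, then following the recommendations ratchets the viewer toward ever more extreme content.

```python
# Toy sketch of an engagement-maximizing recommender (hypothetical model,
# not YouTube's real algorithm). Assumption: predicted watch time peaks
# when a video is slightly more intense than the one just watched.

def recommend_next(current_intensity, catalog):
    """Return the video with the highest predicted watch time."""
    def predicted_watch_time(video):
        # Highest (closest to zero) when the candidate is a bit more
        # intense than the current video; lower the farther away it is.
        return -abs(video["intensity"] - (current_intensity + 0.1))

    return max(catalog, key=predicted_watch_time)

# A catalog of videos ranging from mild (0.0) to extreme (1.0).
catalog = [{"title": f"video_{i}", "intensity": i / 10} for i in range(11)]

# Start with a fairly mild video, then follow five recommendations in a row.
intensity = 0.2
path = []
for _ in range(5):
    pick = recommend_next(intensity, catalog)
    path.append(pick["intensity"])
    intensity = pick["intensity"]

print(path)  # intensity ratchets upward: [0.3, 0.4, 0.5, 0.6, 0.7]
```

The point of the sketch is that nothing here is malicious: each single recommendation is only “a little more intense,” but optimizing for attention at every step compounds into the drift toward extremes that Tufekci documents.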
At a time when Facebook is struggling with several scandals, Tufekci also wrote about four legislative strategies to protect ourselves from Facebook. Among them, she argues that personalized data collection by digital platforms could be limited by requiring clear, concise, and transparent opt-in mechanisms. She also contends that while companies say that users “own their data,” the aggregate use of data should be regulated.
How did students react to Tufekci’s analyses?
Several students wrote about how they had also noticed that ads for particular goods would “follow” them around the internet, through various pages and platforms, and that the suggested videos they were shown often became increasingly extreme. One student explained that they had wondered how these platforms made money, since “it’s a free app.” Upon realizing that the app is free because their attention is the product being sold (by platforms to their advertisers), many students said they were reconsidering how they engaged with the apps and how they thought about the time they spent on these platforms.
Most students were not surprised about the concerns raised, but they were deeply interested to learn more about the behind-the-scenes mechanisms through which these platforms operate and structure social interaction. One student quoted an insightful comment from one of Tufekci’s editorials, in which she wrote “our natural curiosity about the unknown can lead us astray on a website that leads us too much in the direction of lies, hoaxes, and misinformation.”
Another student explained that they regularly spend time watching videos on YouTube and had also noticed the pattern of increasingly extreme suggestions. This student wrote, “I agree that this is an issue because these extreme video recommendations essentially glue people to YouTube making them watch videos for hours and hours.” Many students discussed how, after learning that the algorithms present increasingly extreme content in an attempt to hold their attention longer, they had become more skeptical of just how informative the suggested content was. An especially frustrated student titled their reaction paper “Put a Stop to Youtube,” and wrote of Tufekci’s research, “I hope that her findings spread every where.”
Of course, completely shutting down any of these platforms would create lots of other problems, and the services these platforms provide are often beneficial. For example, I follow lots of other sociologists on Twitter, and have been inspired by new ideas they have shared and even found support and camaraderie among other first-generation and working-class academics.
But many students reacted to Tufekci’s research with the sense that these analyses provided them a toolkit for critically examining and pushing back against the aspects of these platforms that they found particularly problematic. Many students concurred with Tufekci that reasonable and effective regulation of these platforms could bring improvements. Still, students reckoned with how much time and attention they were giving to these platforms.
Informed by Tufekci’s analysis, many students expressed a deeper sense of the dominance of these platforms. In reaction to the ways in which these influential companies have worked to bolster their statuses in the face of scandals, many students invoked an informal description of the conflict perspective we used in class: “people with power use their power to maintain their power.”
How do you think about the ways in which you engage with these platforms? Are you reconsidering how you use them?
If you’re looking for more information, Zeynep Tufekci is on Twitter. If you don’t want to wait for the algorithm to decide when you’ll see her tweets, you can go directly to her page here. You can also read the transcript of her interview with Frontline’s James Jacoby here. And remember, if you have a sociological question, your best strategy for gathering quality information on the topic is to go read the research.
I think it's interesting to see the progression of psychology applied to marketing over the decades. It has only become more manipulative over time and I'm sure it won't cease to do so. Learning about the algorithms used on YouTube feels empowering. It gives me a launchpad from which to search out more information. I am tired of being manipulated as a consumer, but I think an inquisitive approach turns fear and shock into an energy that works for me, not for them.
Posted by: Jaqi123 | January 24, 2019 at 04:09 PM