https://www.newyorker.com/culture/infinite-scroll/the-age-of-algorithmic-anxiety
Late last year, Valerie Peter, a twenty-three-year-old student in Manchester, England, realized that she had an online-shopping problem. It was more about what she was buying than how much. A fashion trend of fuzzy leg warmers had infiltrated Peter's social-media feeds—her TikTok For You tab, her Instagram Explore page, her Pinterest recommendations. She'd always considered leg warmers "ugly, hideous, ridiculous," she told me recently, and yet soon enough she "somehow magically ended up with a pair of them," which she bought online at the push of a button, on an almost subconscious whim. (She wore them only a few times. "They're in the back of my closet," she said.) The same thing later happened with Van Cleef & Arpels jewelry, after a cast member on the U.K. reality show "Love Island" wore a necklace from the brand onscreen. Van Cleef's Art Nouveau-ish flower bracelets made their way onto Peter's TikTok feed, and she found herself browsing the brand's products. The bombardment made her question: "Is this me? Is this my style?" she said.
In her confusion, Peter wrote an e-mail seeking advice from Rachel Tashjian, a fashion critic who writes a popular newsletter called "Opulent Tips." "I've been on the internet for the last 10 years and I don't know if I like what I like or what an algorithm wants me to like," Peter wrote. She'd come to see social networks' algorithmic recommendations as a kind of psychic intrusion, surreptitiously reshaping what she's shown online and, thus, her understanding of her own inclinations and tastes. "I want things I truly like not what is being lowkey marketed to me," her letter continued.
Of course, consumers have always been the targets of manipulative advertising. A ubiquitous billboard ad or TV commercial can worm its way into your brain, making you think you need to buy, say, a new piece of video-enabled exercise equipment immediately. But social networks have always purported to show us things that we like—things that we might have organically gravitated to ourselves. Why, then, can it feel as though the entire ecosystem of content that we interact with online has been engineered to influence us in ways that we can't quite parse, and that have only a distant relationship to our own authentic preferences? No one brand was promoting leg warmers to Peter. No single piece of sponcon was responsible for selling her Van Cleef jewelry. Rather, "the algorithm"—that vague, shadowy, inhuman entity she referenced in her e-mail—had decided that leg warmers and jewelry were what she was going to see.
Peter's dilemma brought to my mind a term that has been used, in recent years, to describe the modern Internet user's feeling that she must constantly contend with machine estimations of her desires: algorithmic anxiety. Besieged by automated recommendations, we are left to guess exactly how they are influencing us, feeling in some moments misperceived or misled and in other moments clocked with eerie precision. At times, the computer seems more in control of our choices than we are.
An algorithm, in mathematics, is simply a set of steps used to perform a calculation, whether it's the formula for the area of a triangle or the lines of a complex proof. But when we talk about algorithms online we're usually referring to what developers call "recommender systems," which have been employed since the advent of personal computing to help users index and sort floods of digital content. In 1992, engineers at Xerox's Palo Alto Research Center built an algorithmic system called Tapestry to rate incoming e-mails by relevance, using factors such as who else had opened a message and how they'd reacted to it (a.k.a. "collaborative filtering"). Two years later, researchers at the M.I.T. Media Lab built Ringo, a music-recommendation system that worked by comparing users' tastes with others who liked similar musicians. (They called it "social-information filtering.") Google's original search tool, from 1998, was driven by PageRank, an early algorithm for measuring the relative importance of a Web page.
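The taste-matching idea behind a system like Ringo can be sketched in a few lines of code. What follows is an illustrative toy, not the original system's implementation: the listeners, ratings, and function names are invented for the example. Each user is a dictionary of ratings, likeness of taste is measured with cosine similarity over the items two users have both rated, and unrated items are scored by similar users' ratings.

```python
from math import sqrt

def similarity(a, b):
    """Cosine similarity over the items two users have both rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[i] * b[i] for i in shared)
    norm_a = sqrt(sum(a[i] ** 2 for i in shared))
    norm_b = sqrt(sum(b[i] ** 2 for i in shared))
    return dot / (norm_a * norm_b)

def recommend(target, others):
    """Score each item the target hasn't rated by summing other users'
    ratings for it, weighted by how closely their tastes match."""
    scores = {}
    for user in others:
        weight = similarity(target, user)
        for item, rating in user.items():
            if item not in target:
                scores[item] = scores.get(item, 0.0) + weight * rating
    return scores

# Three listeners rate musicians on a one-to-five scale.
alice = {"Nirvana": 5, "Soundgarden": 4, "Enya": 1}
bob = {"Nirvana": 5, "Soundgarden": 5, "Pearl Jam": 4}
carol = {"Enya": 5, "Nirvana": 1, "Clannad": 5}

picks = recommend(alice, [bob, carol])
```

In this sketch Bob's tastes track Alice's closely while Carol's run opposite, so Bob's unshared pick (Pearl Jam) outscores Carol's (Clannad) even though Carol rated hers higher. Production systems layer on rating normalization, negative feedback, and much larger neighborhoods, but the core move is the same comparison of one user's history against everyone else's.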
Only in the middle of the past decade, though, did recommender systems become a pervasive part of life online. Facebook, Twitter, and Instagram all shifted away from chronological feeds—showing messages in the order in which they were posted—toward more algorithmically sequenced ones, displaying what the platforms determined would be most engaging to the user. Spotify and Netflix introduced personalized interfaces that sought to cater to each user's tastes. (Top Picks for Kyle!) Such changes made platforms feel less predictable and less transparent. What you saw was never quite the same as what anyone else was seeing. You couldn't count on a feed to work the same way from one month to the next. Just last week, Facebook implemented a new default Home tab on its app that prioritizes recommended content in the vein of TikTok, its main competitor.
Almost every other major Internet platform makes use of some form of algorithmic recommendation. Google Maps calculates driving routes using unspecified variables, including predicted traffic patterns and fuel efficiency, rerouting us mid-journey in ways that may be more convenient or may lead us astray. The food-delivery app Seamless front-loads menu items that it predicts you might like based on your recent ordering habits, the time of day, and what is "popular near you." E-mail and text-message systems supply predictions for what you're about to type. ("Got it!") It can feel as though every app is trying to guess what you want before your brain has time to come up with its own answer, like an obnoxious party guest who finishes your sentences as you speak them. We are constantly negotiating with the pesky figure of the algorithm, unsure how we would have behaved if we'd been left to our own devices. No wonder we are made anxious. In a recent essay for Pitchfork, Jeremy D. Larson described a nagging feeling that Spotify's algorithmic recommendations and automated playlists were draining the joy from listening to music by short-circuiting the process of organic discovery: "Even though it has all the music I've ever wanted, none of it feels necessarily rewarding, emotional, or personal."
Scholars have come up with various terms to define our fitful relationship with algorithmic technology. In a 2017 paper, Taina Bucher, a professor at the University of Oslo, collected aggrieved tweets about Facebook's feed as a record of what she called an emerging "algorithmic imaginary." One user wondered why her searches for a baby-shower gift had seemingly prompted ads for pregnancy-tracking apps. A musician was frustrated that his posts sharing new songs were getting little attention, despite his best attempts to optimize for promotion by, say, including exclamatory phrases such as "Wow!" There was a "structure of feeling" developing around the algorithm, Bucher told me, adding, "People were noticing that there was something about these systems that had an impact on their lives." Around the same time, Tarleton Gillespie, an academic who works for Microsoft's research subsidiary, described how users were learning to shape what they posted to maximize their "algorithmic recognizability," an effort that he compared to a speaker "turning toward the microphone" to amplify her voice. Content lived or died by S.E.O., or search-engine optimization, and those who learned to exploit its rules acquired special powers. Gillespie cites, as an example, when the advice columnist Dan Savage mounted a successful campaign, in 2003, to overwhelm the Google search results for Rick Santorum, the right-wing senator, with a vulgar sexual neologism.
"Algorithmic anxiety," however, is the most apt phrase I've found for describing the unsettling experience of navigating today's online platforms. Shagun Jhaver, a scholar of social computing, helped define the phrase while conducting research and interviews in collaboration with Airbnb in 2018. Of fifteen hosts he spoke to, most worried about where their listings were appearing in users' search results. They felt "uncertainty about how Airbnb algorithms work and a perceived lack of control," Jhaver reported in a paper co-written with two Airbnb employees. One host told Jhaver, "Lots of listings that are worse than mine are in higher positions." On top of trying to boost their rankings by repainting walls, replacing furniture, or taking more flattering photos, the hosts also developed what Jhaver called "folk theories" about how the algorithm worked. They would log on to Airbnb repeatedly throughout the day or constantly update their unit's availability, suspecting that doing so would help get them noticed by the algorithm. Some inaccurately marked their listings as "child safe," in the belief that it would give them a bump. (According to Jhaver, Airbnb couldn't confirm that it had any effect.) Jhaver came to see the Airbnb hosts as workers being overseen by a computer overlord instead of human managers. In order to make a living, they had to guess what their capricious boss wanted, and the anxious guesswork may have made the system less efficient over all.
The Airbnb hosts' concerns were rooted in the challenges of selling a product online, but I'm most interested in the similar feelings that plague those, like Valerie Peter, who are trying to figure out what to consume. To that end, I recently sent out a survey about algorithms to my online friends and followers; the responses I received, from more than a hundred people, formed a catalogue of algorithmic anxieties. Answering a question about "odd run-ins" with automated recommendations, one user reported that, after he became single, Instagram began recommending the accounts of models, and another had been mystified to see the Soundgarden song "Black Hole Sun" pop up on every platform at once. Many complained that algorithmic recommendations seemed to crudely simplify their tastes, offering "worse versions of things I like that have certain superficial similarities," as one person put it. All but five answered "yes" to the question, "Has 'the algorithm,' or algorithmic feeds, taken up more of your online experience over the years?" One wrote that the problem had become so pervasive that they'd "stopped caring," but only because they "didn't want to live with anxiety."
Patricia de Vries, a research professor at Gerrit Rietveld Academie who has written about algorithmic anxiety, told me, "Just as the fear of heights is not about heights, algorithmic anxiety is not simply about algorithms." Algorithms would not have the power they have without the floods of data that we voluntarily produce on sites that exploit our identities and preferences for profit. When an ad for bras or mattresses follows us around the Internet, the culprit is not just the recommendation algorithm but the entire business model of ad-based social media that billions of people participate in every day. When we talk about "the algorithm," we might be conflating recommender systems with online surveillance, monopolization, and the digital platforms' takeover of all of our leisure time—in other words, with the entire extractive technology industry of the twenty-first century. Bucher told me that the idea of the algorithm is "a proxy for technology, and people's relationships to the machine." It has become a metaphor for the ultimate digital Other, a representation of all of our uneasiness with online life.
Users can't be blamed for misunderstanding the limits of algorithms, because tech companies have gone out of their way to keep their systems opaque, both to manage user behavior and to prevent trade secrets from being leaked to competitors or co-opted by bots. Krishna Gade took a job at Facebook just after the 2016 election, working to improve news-feed quality. While there, he developed a feature, called "Why am I seeing this post?," that allowed a user to click a button on any item that appeared in her Facebook feed and see some of the algorithmic variables that had caused the item to appear. A dog photo might be in her feed, for example, because she "commented on posts with photos more than other media types" and because she belonged to a group called Woofers & Puppers. Gade told me that he saw the feature as fostering a sense of transparency and trust. "I think users should be given the rights to ask for what's going on," he said. At the least, it offered users a striking glimpse of how the recommender system perceived them. Yet today, on Facebook's Web site, the "Why am I seeing this post?" button is available only for ads. On the app it's included for non-ad posts, too, but, when I tried it recently on a handful of posts, most said only that they were "popular compared to other posts you've seen."
In the absence of reliable transparency, many of us have devised home remedies for managing the algorithm's influence. Like the Airbnb hosts, we adopt hacks that we hope might garner us promotion on social media, like a brief trend, some years ago, of users prefacing their Facebook posts with fake engagement or wedding announcements. We try to teach recommender systems our preferences by thumbs-downing films we don't like on Netflix or flipping quickly past unwanted TikTok videos. It doesn't always work. Valerie Peter recalled that, after she followed a bunch of astrology-focussed accounts on Twitter, her feed began recommending a deluge of astrological content. Her interest in the subject quickly faded—"I began fearing for my life every time Mercury was in retrograde," she said—but Twitter kept pushing related content. The site has a button that users can hit to signal that they are "Not interested in this Tweet," appended with a sad-face emoji, but when Peter tried it she found that Twitter's suggested alternatives were astrology-related, too. "I've been trying for a month or two now, but I keep seeing them," she said. The algorithm gathers information and silently makes decisions for us, but offers little opportunity to communicate back. In the midst of my work on this piece, Gmail's sorting algorithm decided that an e-mail of fact-checking materials I'd sent to my editor was spam and disappeared it from my "Sent" folder, something I'd never previously experienced and would prefer not to have happen again.
Lately, I have been drawn toward corners of the Internet that are not governed by algorithmic recommendations. I signed up for Glass, a photo-sharing app that caters to professional photographers but is open to anyone. My feed there is quiet, pristine, and entirely chronological, featuring mostly black-and-white city snapshots and wide color landscapes, a mix reminiscent of the early days of Flickr (even if the predominant aesthetic of photography today has been shaped by iPhone-camera-optimization algorithms). I can't imagine having such a pleasant experience these days on Instagram, where my feed has been overtaken by irritating recommended videos as the platform attempts to mimic TikTok. (Why does the algorithm think I like watching motorcycle stunts?) The only problem with Glass is that there isn't enough content for me to see, because my friends haven't joined yet. The gravitational pull of the major social networks is hard to overcome. Since Twitter did away with the desktop version of TweetDeck, which I had used to access a chronological version of my feed, I've been relying more on Discord, where my friends gather in chat rooms to swap personal recommendations and news items. But the reality is that much of what I encounter on Discord has been curated from the feeds of traditional platforms. These new spaces on the Internet are a buffer to the influence of algorithms, not a blockade.
In Tashjian's newsletter, she advised Peter to explore her own tastes outside of social-media feeds. "You have to adopt a rabbithole mentality! Read the footnotes and let one footnote lead to another," Tashjian wrote. Maybe you find a film that you like, she suggested, and watch all of that director's other films. Maybe you discover that you want a nightgown and "find a pretty good imitation" of a great one on Etsy. Of course, so many exploratory paths through culture are mediated by algorithms, too. When I went to Etsy's home page the other day, I was greeted with a display of automatically generated recommendations labelled "New items our editors love." Perhaps owing to some quirk of my Internet browsing history, these included tote bags with German-language slogans and monogrammed travel mugs. Is there a human curator out there who actually loves these things? When they start popping up in my Instagram feed, will I learn to love them, too? You'd think the algorithm would know me better by now. ♦