In 2019, a researcher at Facebook conducted an experiment to see whether the platform really has a tendency to send users down a rabbit hole of extreme and conspiratorial content. The employee set up a pair of fake profiles, one for the Trump-supporting “Carol Smith” and one for the Bernie-loving “Karen Jones,” and then led each one down the path of least resistance, liking whichever groups and pages Facebook’s recommendation system served up. Not a huge surprise: It took less than a week for Carol to be pushed toward online communities dedicated to QAnon, and for Karen to be swamped by lewd anti-Trump material. The details of this experiment were found among the thousands of documents shared with reporters last month by the whistleblower and former Facebook employee Frances Haugen; “Carol’s Journey to QAnon,” in particular, has featured heavily in coverage.

But the mere existence of the rabbit hole wasn’t shocking in itself. In 2017, the reporter Ryan Broderick published a bloggy version of the same idea at BuzzFeed News: “I Made a Facebook Profile, Started Liking Right-Wing Pages, and Radicalized My News Feed in Four Days.” When that piece came out, Facebook responded, “This isn’t an experiment; it’s a stunt.” Now we know that Broderick’s stunt produced, if nothing else, a replicable result.

Carol’s journey, like Karen’s and Broderick’s, addressed specific, urgent questions about how Facebook might polarize and confuse American voters. Facebook’s fake accounts started out by liking Fox News and Donald Trump, or else Elizabeth Warren and MoveOn; the one created for BuzzFeed went with the Republican National Committee and then–White House Chief of Staff Reince Priebus, as well as Hillary Clinton and Barack Obama. Taken all together, they show how Facebook’s mechanics, left unchecked, can grab ahold of even the slightest political leaning and bend it to grotesque extremes.

But none of these experiments has that much to say about what might happen to a Facebook user who doesn’t care about politics at all. Let’s say you never gave the platform any hint about your ideology, or how you’ve ever voted, or whether you even have. Let’s say you made yourself as bland and centrist as you possibly could be, and then let the system do its algorithmic work. Would your account get pulled into some other kind of rabbit hole? And if it did, what would be waiting there?

For two weeks, I’ve been conducting my own Facebook experiment. I decided to make a new account on the platform as an alternative, apolitical version of myself who enjoys only the most widely beloved things in life. Like the fake Ryan Broderick, and the imaginary Carol Smith and Karen Jones, I would not send or accept any friend requests. I uploaded a real picture of myself, and added my real hometown as my location. Then, my editor and I decided on a list of “likes” that might reflect the tastes of a thoroughly nonpartisan, general-interest American: the Rolling Stones, Grey’s Anatomy, Domino’s Pizza, Target, Oprah, wine. When I liked the Target page, a little widget popped up immediately and prompted me to like 10 other pages, which I did. From there, I engaged only with pages and groups and posts that Facebook curated for me, in all of its data-hoovering and look-alike-audience-building wisdom.