It's been four years since the real backlash to Facebook began. For years, the social network had enjoyed a relatively unobstructed rise from dorm-room project to global juggernaut, viewed largely as a tool that connected people from all walks of life and democratized speech, all for the low price of nothing. Sure, plenty of vocal privacy advocates and media watchdogs had decried the company's invasive tracking procedures, and, along with Google, its complete domination of the online advertising market. But to the general public, Facebook was an innocuous tool for staying in touch with friends.
But on the night of November 8, 2016, as the networks began calling more states for Donald Trump, millions of Americans began looking for something to blame. Was it economic anxiety that handed Trump the presidency? Was it racists, citing the nebulous excuse of "economic anxiety"? Was it the inherently unbalanced electoral college system, which makes certain votes count more than others? Was it Hillary Clinton's substantial unpopularity? Was it her emails? Was it FBI Director James Comey? Was it Wikileaks? Guccifer 2.0? Russia? Maybe it was the fault of Jill Stein, or the Bernie Bros.
Or was it Facebook?
In 2016, Donald Trump's campaign bought a shitload of Facebook ads, a strategy his operation has repeated in 2020. As Facebook executive Andrew Bosworth later put it, Donald Trump "ran the single best digital ad campaign I’ve ever seen from any advertiser. Period." Outside of the official Trump campaign, other forces were leveraging the enormous power of Facebook to disseminate news articles, regardless of whether or not the events described had actually happened. In one infamous example, Facebook let millions of users know that Pope Francis had endorsed Trump (he had not). The large ecosystem of fake news, hoaxes, hyperbole, conspiracy theorizing, and reckless speculation on Facebook had created the perfect conditions in which a media-savvy, factually challenged personality like Trump could thrive. He was, after all, no stranger to the "just asking questions" mode of laundering conspiracies—he'd spent years pushing unfounded birther claims about Barack Obama.
By the morning of November 9, the day after the election, when Trump's victory was clear and incontestable, the blame was already being thrown around. In a viral piece for New York magazine (where, full disclosure, I worked at the time), Max Read stated plainly that "Donald Trump Won Because of Facebook."
He wrote:
“At the heart of the problem, anyway, is not the motivations of the hoaxers but the structure of social media itself. Tens of millions of people, invigorated by insurgent outsider candidates and anger at perceived political enemies, were served up or shared emotionally charged news stories about the candidates, because Facebook’s sorting algorithm understood from experience that they were seeking such stories. Many of those stories were lies, or “parodies,” but their appearance and placement in a news feed were no different from those of any publisher with a commitment to, you know, not lying.”
A few days after the election—in remarks that he now certainly regrets because they get cited all the time to illustrate his naivete—Mark Zuckerberg characterized the idea that Facebook was responsible for Trump’s election as “pretty crazy.” It was not that crazy. In short order, it became clear that Facebook had an incalculably large mess on its hands. Facebook’s multifaceted automated systems of sorting, parsing, and recommending content to each individual user had been abused to flood the system with unreliable information and distorted, emotionally slanted news content. Content farms, like ones run by teenagers in Macedonia, expertly worked over Facebook’s algorithmic preferences to build up large, loyal audiences who consumed articles on events that had not happened, or had not happened exactly as described. Crucially, there was, and is, a far larger audience for these stories on the right side of the political spectrum than on the left.
It’s easy, and accurate, to lay much of the blame squarely on Facebook. It’s no secret that for years, and arguably still, Facebook’s automated News Feed favored content that garnered high engagement. The emphasis on engagement (i.e., what keeps the user on Facebook, and keeps them looking at ads) was the priority, rather than accuracy or substance, and it had created a toxic content ecosystem, one that we would later find out had been infiltrated by state-sponsored operations in Russia. Facebook has, in subsequent years, made a big show of stamping out “coordinated inauthentic behavior” from countries like Russia and Iran, and removing an infinitesimal fraction of its two billion-plus accounts.
Elsewhere, in terms of meaningful platform changes, Facebook has dragged its heels in many respects, stalled by a persistent fear of conservative backlash. The Washington Post reported earlier this year that in the weeks following the election, Facebook launched an effort to root out propaganda on its platform. The effort hit a snag when Joel Kaplan, Facebook’s head honcho in Washington, opposed it because it would “disproportionately affect conservatives.” Kaplan, who served as White House deputy chief of staff in the second Bush administration, runs Facebook’s Washington, D.C. office. This theme comes up a lot in recent reporting on Facebook policy decisions. In May, the Wall Street Journal reported that a 2018 internal report found that the platform exacerbated polarization, but efforts to mitigate it had been shelved over concerns, expressed by Kaplan and others, that the efforts would be seen as biased against conservatives.
The fear of a conservative backlash stems from a disastrous 2016 Gizmodo article, whose headline claimed that Facebook—specifically the human editors who curated its now-defunct Trending Topics module—“suppressed” conservative news and hid it from users. The piece described a process that, in pretty much any other context, would be seen as editors exercising editorial judgment: “This story is useful to readers, this one is not.” But framed with a grabby headline stoking a partisan divide (as former Gizmodo Media Group editorial director John Cook later wrote), the story set off an immediate firestorm. Zuckerberg invited prominent conservatives over for dinner to try to mollify them. The specter of regulation and unsubstantiated claims of “anti-conservative bias” have loomed over the company ever since, paralyzing its policy and moderation teams. Zuckerberg has been called to testify in front of Congress about Facebook’s supposed anti-conservative bias numerous times since; this came up, most recently, during Tuesday’s hearings in front of the Senate Commerce Committee.
The fact of the matter is that, even today, despite claims to the contrary, hardcore right-wing content does extremely well on Facebook. Conservative pages regularly make up most, if not all, of the list of top-performing posts each day (performance as measured by CrowdTangle, a social media analytics service that Facebook acquired in 2016). Right-wing pages that have clearly violated Facebook’s policies get special treatment and regularly avoid punishment.
Facebook has no explicit policies, internally or externally, prohibiting right-wing ideas and opinions on its platform, and Zuckerberg’s personal politics are somewhat opaque. If Facebook did have a political agenda, the public would surely know about it by now. What the company does have, however, are various policies that prohibit things like dehumanizing speech, news media created with intent to deceive, and certain conspiracy theories such as QAnon. What does it say about Facebook that common-sense policies—intended to ensure the safety and dignity of all of the site’s users—would, according to its own stress tests, disproportionately impact conservative publishers? If politically neutral measures intended to reduce harm on Facebook impact right-wing publishers and users more than left-wing ones, is that Facebook’s fault? Or does it point to an overarching alignment between Facebook’s incentives and the style and structure of right-wing media?
It is easy and tempting to look at Facebook’s problems in a bubble—a new media platform, with new formats and distribution channels, creating new, thorny problems society has never encountered before. It is easy to blame recommendation algorithms, and artificial intelligence, and dastardly foreign powers, and demand technological solutions. There are obvious changes that Facebook could implement (for years, it dragged its feet on rules banning Holocaust denial), but the handwringing over Facebook, in the media and more broadly, feels largely misplaced—focused on cold, unfeeling technology rather than on traditions and rhetoric that stretch back decades. Facebook’s relationship to conservatism does not begin with Mark Zuckerberg creating a ‘Hot or Not’ clone in 2003. Years before its creation, conservative media outlets and pundits laid the groundwork and created the framework that makes Facebook so effective.
It is tempting to see the conservative dominance of Facebook as savvy operators adapting to new conditions—behavior shaped by technology. But what if it’s the other way around? What if the same principles that make conservative media what it is are also what make Facebook such a powerful distribution channel? For Facebook to truly reckon with its problems and achieve its stated goal of becoming a universal tool for the public good, it first needs to acknowledge that the supposedly party-neutral problems of disinformation, conspiracy, and toxicity are actually a partisan issue.
For decades, the right has been creating a media ecosystem that emphasizes distrust in institutions, rewards conspiracy theories, and places an emphasis on authenticity and emotional expression. It is difficult, personally speaking, to be particularly worried about QAnon when I also lived through birtherism a decade ago. Conspiracy theories about the Clinton family have persisted for my entire life. Every decade has its own delusions—the ‘60s had A Texan Looks At Lyndon, a popular, self-published book full of “rumormongering and mad-dog ruminations,” as Texas Monthly put it, including the claim that Johnson was closely involved in the Kennedy assassination. By the time of the ‘64 Democratic National Convention, “at almost 7.5 million copies, A Texan Looks at Lyndon had become the best-selling book of any kind in the country and the most successful political book of all time,” thanks in part to right-wing groups like the John Birch Society.
“There is a kind of tendency towards conspiracism on the right that is stronger than on the left,” Columbia historian and conservative media scholar Nicole Hemmer told me. “And it's fed by a kind of structural issue, which is that, starting in the 1940s and 1950s, conservative leaders are making the argument that you can't trust non-conservative media, that non-conservative media is actually liberal media, and that it’s agenda-driven liberal media that's trying to brainwash you or mislead you or lie to you.” (It’s worth noting that, as historian David Greenberg outlined to me in an email, people on both the right and left become more prone to conspiracy as they move to the extremes. The big difference now is that the mainstream GOP is far closer to its respective extreme end than the mainstream Democratic party.)
“There's always been this way that conservatives from the very beginning, in the McCarthy era, have focused on very small connections or associations and turn them into these big deals,” A.J. Bauer, a professor at NYU who specializes in the history of conservatism, says. In the ‘50s, the pamphlet “Red Channels: The Report of Communist Influence in Radio and Television” turned humdrum events and associations, like attending an NAACP event, into damning indictments. “A lot of the information that was in there as quote-unquote damning, it's completely banal, and not damning at all in hindsight,” Bauer says. The conservative ability to make mountains out of molehills, to insinuate wrongdoing and punish people for it without actually proving it, stretches back to the start of the modern conservative movement and has sustained itself well into the present (see also: Hunter Biden’s laptop, Hillary’s emails, Obama’s “hip-hop barbecue”).
In the ‘70s, as CUNY professor and author of Fox Populism Reece Peck explains, Roger Ailes entered the Nixon White House. Nixon had a famously fraught relationship with the press. This is where the right-wing narrative of the educated, liberal elite having captured the government really begins to crystalize, with figures like Pat Buchanan and Kevin Phillips claiming populism as the Republican value. “They were the first to kind of use this narrative that attacks intellectuals. And, it wasn't just a rhetorical theme, but it was also that you had to execute that in your style,” Peck says. “So you weren't just against the elite: you don't act like the elite on television, in the way you talk, in the way that you show your emotion. And in the way that you always frame political issues in personal terms, in terms of your biography.” It’s in this period, in the ‘70s and ‘80s, that the persistent Republican philosophies—populism, distrust in the media, the Republican party as authentic and down-to-earth—start to become married to a certain type of presentation on television.
From the 1940s into the 1980s, American journalism was dominated by the “high-modern” paradigm. “The news anchor was viewed and conceptualized as an informational expert, almost like a stenographer in a courtroom. They're just there to report the facts and convey the information in an accessible way for the audience,” Peck says. That began to change in the ‘80s as cable television splintered the viewing audience into partisan camps, and news programs moved away from a public-service model and towards an entertainment model. In the late ‘80s, the FCC under the Reagan Administration repealed what is known as the Fairness Doctrine, which required political programming to present both sides of an issue. Partisan media exploded, especially in the world of talk radio. On television, the bulk of this tectonic shift is attributable to one man: Rupert Murdoch, the head of News Corp. The Australian media magnate, Peck says, “recognized that there was a market, or an untapped desire, for news with more of a popular style that was not performing this kind of middlebrow, genteel disposition of the anchor. They wanted people that were seemingly—the big watchword is ‘authentic’—that kind of lower their guard, that let their hair down, that display emotion.”
In the ‘80s, being anti-elite grew from just a stance into an aesthetic and a presentation style. Even before Fox News came into being, Murdoch applied this principle of creating news content with a strong emotional component to television shows like A Current Affair and Inside Edition, where Bill O’Reilly got his start. This type of show excelled in developing what sociologist Laura Grindstaff calls “the money shot”—the reaction to a paternity test on Maury, a thrown chair on Jerry Springer, a sobbing guest on Ricki Lake. The key to television that got good ratings, according to Peck, was “the producers that understood how to create conditions for an emotional outburst.” By the time Murdoch launched Fox News in 1996, the public was conditioned for the tabloidization of television news.
When Fox News launched in the mid ‘90s, it embraced this attitude of emotion and authenticity so thoroughly that now, according to Peck, “we conflate that style with conservatism.” The style played directly into claims that the liberal media was for snobby intellectuals and conservative media was for regular people. “Shep Smith was from A Current Affair, Bill O'Reilly was from Inside Edition, Hannity was a talk radio host from Atlanta. [Murdoch and Roger Ailes] purposely picked people that looked unpolished or that would go against the norms of TV news, aesthetically and stylistically,” Peck recounts. “It just fit into this whole narrative that if you're a white working class guy that feels like you're an underdog, and you're embattled, and people look down at you, you must be conservative.”
Around the same time, in the late ‘80s, following the elimination of the Fairness Doctrine, Rush Limbaugh skyrocketed to fame. “Rush Limbaugh doesn't write up a script, and then just reads it straight up. He might have an outline of what he wants to talk about that day, but he riffs. And then he'll get callers that call in and ask him questions, and he'll banter back and forth with them,” Bauer notes. “There's something about that banter, and that being able to speak off the cuff that lends itself to a sense of authenticity, of credibility, that is rooted in the values and nature of the person itself. Not in that person's knowledge or ability to be an expert in something.” In other words, the mere act of being able to speak about these issues at all, at length, regardless of accuracy, is seen as a strength of right-wing hosts like Limbaugh.
Now, take a second to look at the vocabulary we’ve been using to discuss the changes in the news media over the second half of the 20th century. As conservative media grew to get around “gatekeepers,” it brought with it content that was “emotional” and “authentic.” News anchors brought their “personal” experiences into the broadcast.
Eerily similar language is used by social media executives to describe their own platforms. In 2016, for the launch of Facebook’s live-video product, Mark Zuckerberg told BuzzFeed that “We built this big technology platform so we can go and support whatever the most personal and emotional and raw and visceral ways people want to communicate are as time goes on” (emphasis added). Facebook demands authenticity with policies as fundamental as requiring that users provide their real names and a photo. It roots out “coordinated inauthentic behavior.” It has stood behind its targeted ad product, claiming it as a boon for small businesses that otherwise might not be able to afford ad campaigns or would be kept out by gatekeepers. The content that goes the most viral on Facebook is not intellectually stimulating, but emotionally stimulating—content designed to get you to share, whether it be a heartwarming video or a rage-inducing hot take. And like conservative media, battling against established, supposedly liberal outlets, Facebook is a similarly styled insurgent, taking on old-media standards of quality, format, and distribution.
Whether it’s giving platform manipulators like Ben Shapiro a pass or cultivating Instagram’s highly curated aesthetic, we know that Facebook’s relationship to authenticity is—like Fox News’s—actually very fluid. But publicly upholding it as a primary tenet is just one thing Facebook has in common with a conservative rhetorical style stretching back decades. The Murdoch-induced tabloidization of news, the rise of infotainment, and an emphasis on authenticity and significant emotional stakes seem tailor-made for a social media platform like Facebook, which, like television, is most interested in keeping eyeballs glued to the screen. A wide variety of publishers have spent the better part of a decade trying to game Facebook—inspirational viral content mills like Upworthy and the Dodo, short-form video masters like the team at BuzzFeed’s Tasty—and all of them have suffered the catastrophic effects of a Facebook algorithm change. The only type of content that seems to survive each shift is conservative media, in part because it is nominally news content (which Facebook claims to care about) and also because the culture of conservative media is deeply aligned with the values that drive Facebook’s products.
The parallels between talk radio and social media are also abundantly clear. The thrill that an Instagram or Twitch user might get when they receive a callout from their favorite streamer is analogous to the interactivity of talk radio. Talk radio, Hemmer explained, “is more analogous to something like Facebook, because on Facebook, you feel like you're doing something when you share these stories, or when you comment on these stories. Watching television is much more passive.”
UPenn professor and media historian Brian Rosenwald has spent a lot of time studying Limbaugh. The radio host, according to Rosenwald, “has developed what I call like a plausible deniability style, where you'll notice the people jump on him, and he'll say, ‘Well, I never said that.’ And so you go back and look at the transcript, and he didn't actually ever say it. He said, ‘Well, someone handed me this story’ or ‘I got this from this newsletter.’ And he's just passing it along. He's not saying he believes it, necessarily. But he's sharing something.” Facebook lets everyone be their own Rush Limbaugh, 24 hours a day, seven days a week, with next to no oversight.
It’s vitally important to sketch out these links and similarities between conservative media’s style, aesthetic, and rhetoric and what Facebook wants from its users, because Facebook itself is unwilling to do so. Part of the power of right-wing media comes from its effectiveness at “working the refs”—invoking technicalities to push arguments into the mainstream or skirt censure—whether those refs are the established news networks of old, or the dominant social networks of the new millennium. This has proven particularly useful on Facebook. Working the refs requires understanding the rules, understanding where the line is, and sprinting up to it without going over. It also helps to have powerful allies in Congress, ones who can credibly threaten regulation, deploying similar strategies. For all of the worries about outright fake news and misinformation, a lot of conservative media in a variety of formats operates in a vague “just asking questions” mode that falls just short of defamation and offers plausible deniability. This approach, developed over the last 70 years, has effectively neutered any action Facebook might take to improve the overall experience on its platform—to make its information more reliable and to make interactions less contentious.
A common tech maxim goes, “It’s not a bug, it’s a feature.” When Facebook concludes that certain platform changes, like ones reducing the reach of conspiracy theories and misinformation, might “disproportionately” affect conservatives, that’s not a bug. It’s an indication that one political party—the right wing—traffics more in these formats and styles. The problems that Facebook is grappling with are not unique to Facebook: since the 1950s, conservative media has been very good at circumventing or infiltrating politically neutral territory and using it to expand its influence. Facebook is using poorly paid, overworked content moderators to try to combat seven decades of media patterns in a politically neutral way. It’s a fool’s errand, and if Zuckerberg ever cops to it, he will have done so far too late.
Rosenwald offered up this analogy for Facebook’s fixation on party-neutral enforcement: “It's like referees in a football game saying they want to call the same number of penalties on each team, and before the game, they decided they're going to do an experiment: they tell one team, ‘We're going to call the same number of penalties on these two teams.’ The team they let know is then incentivized to commit penalties on every play, because they know they're not going to get called for more penalties.”
“So much of what the tech industry is based upon is that they're kind of like neutral technocrats,” Bauer said. “That's just not the way out of this particular crisis.”
“This is the problem with this cyber-utopian or tech-utopian rhetoric that emerges from Silicon Valley, and its emphasis on the conduits of communication—the infrastructure, the technical—is it's always a technical problem,” Peck added. “But what that does is it dismisses questions of culture and specifically political culture.”
“Most of the discourse and arguments around things like Facebook don't take into account that we tend to point to these issues in our politics as though they are technological problems when they are social and political problems,” Hemmer says, “that not only aren't caused by these new technologies, but can't be fixed with technological solutions.”
The first step towards a workable solution then is to acknowledge the political reality, and the accumulated years of knowledge and experience that would easily identify the underlying meaning of certain platform expressions, and justify changes to how Facebook addresses partisan media. For a company so focused on examining trends, this is one that it has willfully ignored.