It's Facebook's fault Donald Trump won
How the hell did this happen?
How did the executioner from The Apprentice manage to win this election?
As the nation struggles to pick up the pieces of Trump's surprise victory, many Democrats, pundits and journalists are left looking for something, or someone, to blame for it.
It could have been any number of factors, really. America's bewildering adoption of white nationalism as a platform. The failure of liberal politics to enact economic change. The deep-seated racism and misogyny of the American electorate. Wikileaks. Hillary Clinton's imperfect candidacy. Each of these contributed in its own way to our nation electing an inexperienced, unpredictable bigot whom, until recently, no one in the political mainstream would endorse. But we'd like to offer a different reason.
Facebook. Facebook did this.
Globally, about a billion people use Facebook, including 72 percent of adult Americans. Forty-four percent (or roughly 170 million) of those Facebook users say the site is where they get their news, a figure that only grows when you add Instagram, WhatsApp and the dozens of other social platforms through which the company wields considerable influence. For reference, Facebook's readership is many times larger than the circulation of the country's two largest media outlets, USA Today and the Wall Street Journal, which boast only 21 million and 2.2 million readers respectively.
This means that Facebook has access to an audience of unprecedented size and diversity, a reach that gives its readers the mistaken impression that the news they find there is accurate and reputable.
Well ... not so much. Thanks to its labyrinthine news feed algorithms, targeted advertising, trending boxes and inability to sort real journalism from the fake clickbait articles churned out by Macedonian teenagers, Facebook has become the largest distributor of misinformation in written history. Think that had anything to do with who we're calling "President-Elect" now?
Here, we examine the ways in which these factors led to the most inept man on earth becoming the most powerful man on earth.
Facebook is the best place on earth for advertisers who want to target specific groups. Using massive amounts of personal data collected about its users, Facebook enables the micro-targeting of ultra-niche audience categories — such as swing state voters concerned about abortion — to deliver news and stories it knows will generate the most engagement. In fact, ProPublica uncovered nearly 50,000 unique categories into which Facebook sorts its users.
But, Facebook doesn't just allow for targeting — it also allows for exclusion.
Facebook's system readily allows advertisers to exclude users with black, Hispanic, and other "ethnic affinities" from seeing certain ads. That means that, if you're a pro-Trump media platform, you can uncheck the "Hispanic" and "black" boxes so that only white people of a certain background see your articles. And thus, very particular groups of people receive very particular media.
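Mechanically, this kind of exclusion is just a filter applied before an ad is ever served. Here's a minimal sketch of the idea in Python; the field names and logic are our own illustration (Facebook's actual ad-delivery system is, of course, not public):

```python
# Illustrative sketch of exclusion-based ad targeting.
# Field names like "ethnic_affinity" are hypothetical, not Facebook's real schema.

def eligible_audience(users, include_traits, exclude_affinities):
    """Return users who match the targeted traits and aren't excluded."""
    audience = []
    for user in users:
        if user.get("ethnic_affinity") in exclude_affinities:
            continue  # the excluded group simply never sees the ad
        if include_traits.issubset(user.get("traits", set())):
            audience.append(user)
    return audience

users = [
    {"name": "A", "ethnic_affinity": "Hispanic", "traits": {"swing_state"}},
    {"name": "B", "ethnic_affinity": None, "traits": {"swing_state"}},
]
served = eligible_audience(users, {"swing_state"}, {"Hispanic", "African American"})
# Only user "B" is served the ad; user "A" is silently filtered out.
```

The point of the sketch: the excluded users get no notice, no opt-out, no trace. The ad just never appears.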
Here, a screenshot shows Facebook's option for racial ad targeting in its housing section:
Wow! It doesn't take a rocket scientist to see that poor, uneducated whites being served pro-Trump propaganda wouldn't necessarily come across any articles about his history of tax evasion or the accusations of child rape.
Lo and behold, guess who voted for Trump? Mostly uneducated whites ... of which there are a lot in this country.
If you're wondering if that's even legal, well, it's kind of not. The Civil Rights Act of 1964 prohibits the “printing or publication of notices or advertisements indicating prohibited preference, limitation, specification or discrimination” in employment recruitment.
How does Facebook get around that then? Well, by simultaneously allowing and condemning it.
"We take a strong stand against advertisers misusing our platform: Our policies prohibit using our targeting options to discriminate, and they require compliance with the law,” says Steve Satterfield, privacy and public policy manager at Facebook. “We take prompt enforcement action when we determine that ads violate our policies."
Whether that action happens soon enough is up for discussion.
The trending box
It's hard to believe that a little thing like Facebook's trending box could sway people's opinions so vastly, but a growing body of research has found that when people are undecided about an issue, a trusted source of information can easily help them form an opinion. And Facebook, as we mentioned above, is where an enormous share of Americans get their news, which gives it outsized influence over people's beliefs.
ProPublica explains this in greater depth: "Some demographic groups are especially susceptible to this kind of manipulation. A biased feed could potentially shift far more people than a biased source like Fox News because people expect a feed on a platform like Facebook to be impartial and unbiased. The more people trust the feed, the greater the influence, in part because a feed is a gateway: It links people to other material, some of which they read at length. When links to certain material have been omitted from a feed, fewer people read it, and its influence is diminished."
Considering the ability Facebook gives advertisers and individuals to micro-target specific populations, it's easy to see how biased feeds could proliferate on anyone's account. Stories start trending when more people interact with them in their feeds, which in turn fuels a positive feedback loop in which certain stories are popularized among certain groups.
And so, people are only fed trending news. Trending news is news that has the most engagement. Sensationalized news gets the most engagement.
Starting to get it?
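That feedback loop is easy to model. Here's a toy simulation, entirely our own construction and not Facebook's actual trending algorithm, showing how a higher click rate compounds once exposure is allocated by accumulated engagement:

```python
# Toy model of an engagement-driven trending loop (not Facebook's real algorithm).
# Each round, exposure is split in proportion to accumulated engagement,
# and a sensational story converts exposure to clicks at a higher rate.

def simulate_trending(stories, rounds=10, exposure_per_round=1000):
    """stories maps a story name to its click-through rate."""
    engagement = {name: 1.0 for name in stories}  # tiny equal seed for every story
    for _ in range(rounds):
        total = sum(engagement.values())
        for name, click_rate in stories.items():
            # More engagement so far -> a bigger slice of this round's exposure.
            exposure = exposure_per_round * engagement[name] / total
            engagement[name] += exposure * click_rate
    return engagement

stories = {"sober_report": 0.02, "sensational_hoax": 0.06}
result = simulate_trending(stories)
# Both stories start equal; the hoax's higher click rate snowballs until it dominates.
```

Run it and the hoax doesn't just win, it runs away with the feed, because every click it earns buys it more exposure in the next round.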
The last, and most glaringly evident, way in which Facebook contributed to Trump's victory is its inability — and refusal — to deal with the problem of fake news.
Fake news isn't a problem unique to Facebook by any means, but thanks to the site's gargantuan audience and the way its algorithms influence sharing — for example, the emotionally charged action of sharing, or the "show me more like this" feedback option — it's a place where false publishers can thrive, luring users off Facebook and onto their own sites, where they encounter stories that are often exaggerated or completely fictional.
During the entire election, shady stories like these proliferated all over Facebook. False, sensationalized "news" like "The Pope endorses Trump," "Hillary Clinton bought $137 million in illegal arms" and "The Clintons bought a $200 million house in the Maldives" was what Facebook users encountered, and many such stories received thousands — if not millions — of likes, comments and shares. And even though fact-checking organizations like Snopes made valiant efforts to discredit the false journalism, Facebook's sharing and privacy settings meant many of those corrections got lost in the cosmic galaxy of fake news. Unsurprisingly, this made it hard for the average person to tell what was real and what was not.
Interestingly, while both liberal and conservative fake news publishers post plenty of bullshit stories on Facebook, conservative media sources do so twice as often as liberal outlets. An analysis by BuzzFeed found that 38 percent of posts shared from three large right-wing Facebook pages included "false or misleading information," while three similarly large left-wing pages did so 19 percent of the time. Neither side is perfect in this matter, obviously, but when you consider that twice as many false stories populated targeted conservative users' pages, it's clear how that would translate into millions of pro-Trump votes.
However, what does seem to be a uniquely Republican issue is the heavy validation given to false and unsubstantiated news. Throughout his campaign, Trump famously and repeatedly used false news stories and conspiracy theories to further his agenda. Whether it was the Obama birth certificate issue, calling climate change a hoax or questioning Hillary's health, he encouraged his followers not to trust the "corrupt traditional media," making them all the more likely to believe Facebook's wealth of fake, right-wing news.
Even more infuriatingly, Facebook users weren't even reading these stories. Instead, they merely reacted to the headlines, which they'd promptly share themselves. This no doubt has something to do with Facebook's recent update to its headline algorithm, which prioritizes and popularizes the headlines with the most user engagement. Under the algorithm, posts by pages or people with the most likes or friends appear closer to the top of your news feed, and, guess who has tons of these things?
Say it with us: fake news sites.
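In rough terms, ranking by engagement rather than accuracy looks something like this. This is a deliberate simplification we've written for illustration — the real News Feed ranking is far more complex and unpublished — but it captures the core complaint: nothing in the score rewards being true.

```python
# Simplified illustration of an engagement-first feed ranking.
# The weights and field names are our assumptions, not Facebook's.

def rank_feed(posts):
    """Order posts by engagement signals; accuracy plays no role at all."""
    def score(post):
        return (post["likes"] + 2 * post["comments"] + 3 * post["shares"]
                + 0.1 * post["author_followers"])
    return sorted(posts, key=score, reverse=True)

posts = [
    {"headline": "Budget analysis, fact-checked", "likes": 120, "comments": 10,
     "shares": 5, "author_followers": 2000, "accurate": True},
    {"headline": "Pope endorses Trump!", "likes": 9000, "comments": 1500,
     "shares": 4000, "author_followers": 500000, "accurate": False},
]
feed = rank_feed(posts)
# The fabricated story, with its huge follower base and share count, ranks first.
```

A page that has spent months accumulating likes and shares on outrage bait outscores a sober outlet on every post, no matter which one is telling the truth.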
Here, for example, is a real-life sighting of a guy telling a woman that Hillary and her aide Huma Abedin are lesbian lovers, based solely on "material that appeared to have been printed off the internet."
Of course, lies and exaggerations like this are nothing new in the world of politics. They've always been there; they've always been the foundation of campaigning. It's just that now, Facebook has made them easier to disperse.
And Facebook receives no punishment for doing so, because no one notices. People have a strong, innate preference for media that confirms their beliefs, and an even stronger tendency to believe that what they're reading is true, provided they like it. This is the core of a widespread phenomenon called "confirmation bias": the tendency to interpret information in a way that confirms one's existing views.
These biases are exactly what gets exploited in political organizing, something for which Facebook is now essential.
As New York Magazine put it exceptionally well in a recent article on the topic of how Trump was able to manipulate Facebook during the election:
As Obama understood well in 2008, the internet provides political candidates a previously unimaginable opportunity to identify, communicate with, and organize supporters — an opportunity that, significantly, exists outside the traditional party apparatus. Trump, like Obama before him, was able to connect with voters outside the more stifling confines of political-party organizing. He was able to rally a little under half of America because he could express a political position people haven't had a voice for in years — white welfare-state ethno-nationalism — one that the party would once have choked off for threatening its delicate coalition of business interests and white workers ... In fact, the grass-roots nature of the Trump campaign originated on Facebook, where supporters were able to connect to each other as well as their candidate, which gave them platforms far beyond what even the largest Establishment media organizations might have imagined, and allowed them to effectively self-organize outside the party structure. Who needs a GOTV database when you have millions of voters worked into a frenzy by nine months of sharing impassioned lies on Facebook, encouraging each other to participate?
The volume and visibility of the "impassioned lies" Facebook let proliferate was, if you'll remember, overwhelming. So people had to be selective about what they read and shared. They did so not to educate themselves (Facebook made it hard to tell what was news and what was fiction), but to confirm what they already believed to be true. Again, confirmation bias at its finest.
All this has happened before. Every time a new species of media crops up, it challenges the structure and the scale of political power and the media that covers it. With each new innovation — radio, television, the internet — new voices are heard and new ways of telling stories emerge. It just so happens that the ones we're finally hearing on Facebook right now are ones of fear and hate. For all its good intentions and humanitarian facade, Facebook is as much a vehicle for propaganda as it is for knowledge, and, as we watch the nation's first orange president assume the throne, that's something we'll have to grapple with until something better comes along.
If it wants to address these challenges, Facebook will need to change its business model. On a site where whether content is powerful and factually accurate matters less than whether it's liked, shared or monetized, the potential for harm abounds. These "engagement" metrics mutate the environment of true journalism, making it easy as pie for clickbait, hyperbole and misinformation to squeeze, unnoticed, between your cousin's inane baby shower photos.
On a platform with so many engaged users, it's unjust, condescending and dire to willingly feed them this sort of bullshit rather than providing them with meaningful, well-researched stories.
Look where it got us. Until Facebook updates the way truth is shared, it's going to keep getting us there.