On Facebook, Fake News, and Our Collective Loss of Reality

Featured image by The White House (Pete Souza) – The White House’s Photostream, Public Domain

Facebook CEO Mark Zuckerberg recently penned a lofty 5,700-word screed in which he outlined his grand dream for fixing our broken political system. In a nutshell, he thinks our problem is that our online communities are failing on five key measures: being supportive, being safe, being informed, being civically engaged, and being inclusive. While these make good bullet points for a slideshow, the devil is in the details, and it’s clear Mr. Zuckerberg doesn’t have any real solutions to offer. What’s more, Facebook itself has contributed to the problem of fake news and “low-information voters” by magnifying the most sensational posts (which are often completely baseless) and by hiding content you don’t engage with (which is most often content you simply disagree with). Missing from his manifesto is any acknowledgement of that simple fact, which colors the rest of it. His proposal bears deeper inspection in that light:

Being Supportive

Zuckerberg is awfully proud of the fact that there are groups on Facebook centered around helping people:

“A woman named Christina was diagnosed with a rare disorder called Epidermolysis Bullosa — and now she’s a member of a group that connects 2,400 people around the world so none of them have to suffer alone. A man named Matt was raising his two sons by himself and he started the Black Fathers group to help men share advice and encouragement as they raise their families. In San Diego, more than 4,000 military family members are part of a group that helps them make friends with other spouses. These communities don’t just interact online. They hold get-togethers, organize dinners, and support each other in their daily lives.”

Right. People sometimes use group functions on social media to do good things. Got it. They also use them as political platforms for militias and racist law-enforcement groups, and Facebook’s ethnicity-filtering system in its ad manager can make sure your horrible message reaches just the lily-white folks you want it to. Zuckerberg says his solution is to make meaningful groups more visible, without ever clarifying what makes a group meaningful:

“We recently found that more than 100 million people on Facebook are members of what we call “very meaningful” groups. These are groups that upon joining, quickly become the most important part of our social network experience and an important part of our physical support structure.”

I imagine that joining a group of folks who all share your unabashed hatred of immigrants might “quickly become the most important part of our social network experience and an important part of our physical support structure.” As long as Facebook remains an ad platform whose sole value claims are ubiquity and the ability to target by detailed social data, the “supportive groups” that are organically formed and made visible by Facebook’s algorithms will continue to be siloed around whatever echo chamber a user hides in.

Being Informed

Somehow, Zuckerberg manages to write the following without the slightest hint of irony or accountability:

“Giving everyone a voice has historically been a very positive force for public discourse because it increases the diversity of ideas shared. But the past year has also shown it may fragment our shared sense of reality. It is our responsibility to amplify the good effects and mitigate the bad — to continue increasing diversity while strengthening our common understanding so our community can create the greatest positive impact on the world.”

In the past ten years, Facebook has done more to “fragment our shared sense of reality” than any other force on this planet aside from religion. The EdgeRank and content-visibility algorithms that reward “engagement” with greater visibility mean that the most contentious and sensational content perpetually reaches the largest audience. Least-common-denominator messaging has been proven out as a business model on Facebook, with sites like BuzzFeed and Upworthy making millions on low-quality posts catering to the base and the banal.

Snopes.com saw much of its growth driven by countering hoaxes and fake news on Facebook, before finding itself the victim of lies attacking its credibility, driven largely by Facebook posts asserting the site was liberal propaganda. Every previous attempt to provide human fact-checking has fared badly for Facebook, leading it to be accused of censoring both conservatives and liberals. As long as Facebook relies on community reporting to drive its censorship algorithms, content is going to be attacked ideologically rather than rationally.

How does Zuckerberg intend to solve this?

“A more effective approach is to show a range of perspectives, let people see where their views are on a spectrum and come to a conclusion on what they think is right. Over time, our community will identify which sources provide a complete range of perspectives so that content will naturally surface more.”

Ah. He’s going to “show a range of perspectives” so that people can “come to a conclusion on what they think is right”. In other words, exactly what’s happening now, except with more of his fiddling about with what you see of what your friends share.

This sort of thinking is EXACTLY why a commercial entity that makes its money on page views is wildly incapable of making any significant improvement to our political conversation online. Facts are facts regardless of whether someone believes them. There should be a clear delineation between fact and speculation, because facts are provable. One of the most frustrating things about conversation on Facebook is when people respond to facts by saying “well, we obviously have different beliefs.” This idea that beliefs somehow trump objective reality is fundamental to the breakdown occurring in American society right now. If Zuckerberg were committed to fixing that, he’d go hard and put a “factual” label on journalistic organizations that follow a mature code of ethics and consistently demonstrate integrity in reporting. But of course he’ll never do that, because he knows it would create an outcry from those who disagree with the news because of their beliefs (and what a bizarre sentence that is to type out), and that outcry is bad for the bottom line.

Even the good ideas he has are reliant on the algorithm, and thus, subject to abuse:

“Fortunately, there are clear steps we can take to correct these effects. For example, we noticed some people share stories based on sensational headlines without ever reading the story. In general, if you become less likely to share a story after reading it, that’s a good sign the headline was sensational. If you’re more likely to share a story after reading it, that’s often a sign of good in-depth content. We recently started reducing sensationalism in News Feed by taking this into account for pieces of content, and going forward signals like this will identify sensational publishers as well.”

This sounds great on the surface, until you realize that sites like NaturalNews.com and Mercola.com (I’m not linking because I don’t want to send any traffic there) often have avid readers despite being sensationalized crap. People pore over those blog posts looking for solutions to personal problems, while folks may share an AP or Reuters piece based on the headline alone and then go read it while a discussion forms. Simple algorithms don’t account for the complexity of human behavior.
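To make the mechanic concrete, here is a minimal sketch of the kind of “share after reading” signal the quoted passage describes. Everything here — the function name, the inputs, the numbers — is my own illustrative assumption, not Facebook’s actual code or metric:

```python
# Hypothetical sketch of a "share after reading" signal.
# All names and numbers are illustrative assumptions, not Facebook's code.

def headline_driven_ratio(shares_without_reading: int,
                          shares_after_reading: int) -> float:
    """Fraction of shares made without the sharer ever opening the story.

    Values near 1.0 suggest headline-driven (sensational) sharing;
    values near 0.0 suggest people shared after actually reading.
    """
    total = shares_without_reading + shares_after_reading
    if total == 0:
        return 0.0  # no shares at all: nothing to measure
    return shares_without_reading / total


# A clickbait piece: 90 blind shares, 10 shares after reading.
clickbait = headline_driven_ratio(90, 10)

# A wire-service story shared mostly from the headline, read later.
wire_story = headline_driven_ratio(80, 20)
```

Note how the sketch illustrates the critique above: a solid AP or Reuters piece shared from its headline and read afterward scores nearly as “sensational” as genuine clickbait, while a junk site with avid readers scores well.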

Being Civically Engaged

Let me summarize what Zuckerberg says about being civically engaged: blah blah blah blah voting is important blah blah blah blah democracy.

Being Inclusive

In the closest thing to a mea culpa that appears in the entire piece, Zuckerberg notes that they’ve had problems:

“This has been painful for me because I often agree with those criticizing us that we’re making mistakes. These mistakes are almost never because we hold ideological positions at odds with the community, but instead are operational scaling issues.”

Well, yeah, that’s why you’re not the platform to fix these problems. In fact, the more you meddle with the content flow, the worse it gets. As it turns out, though, he’s just getting warmed up:

“I may be okay with more politically charged speech but not want to see anything sexually suggestive, while you may be okay with nudity but not want to see offensive speech. Similarly, you may want to share a violent video in a protest without worrying that you’re going to bother friends who don’t want to see it. And just as it’s a bad experience to see objectionable content, it’s also a terrible experience to be told we can’t share something we feel is important.

The idea is to give everyone in the community options for how they would like to set the content policy for themselves. Where is your line on nudity? On violence? On graphic content? On profanity? What you decide will be your personal settings. We will periodically ask you these questions to increase participation and so you don’t need to dig around to find them. For those who don’t make a decision, the default will be whatever the majority of people in your region selected, like a referendum. Of course you will always be free to update your personal settings anytime.”

OK, so to be more inclusive, you’re going to default to setting content filters specific to your region. And how big is a region? Is the South a region? Central Florida? Is the goal eventually to have a thousand item list of acceptable versus unacceptable content? And more importantly, doesn’t this explicitly contradict the idea above of helping people be informed by exposing them to different ideas? How is that in any way “inclusive”?

We don’t need a social media platform protecting us from offensive speech. That is literally how we got to where we are now: Facebook siloing us according to culture and ideology, magnifying in-group agreement, and polarizing out-group thoughts and behavior. What we need is a collective reality check. Politicians (President Trump especially) have been systematically attacking the integrity of the fourth estate in order to silence the only group that has historically stood up to them by publishing the facts. When I see the sheer number of people on Facebook repeating these “fake news” accusations against mainstream journalism organizations on any given day, I’m floored by how effective those attacks are.

On the other hand, when I see this sort of self-congratulatory mental masturbation from someone like Zuckerberg, I’m floored by the self-importance and privilege his perspective shows. Perhaps he’d like to join us down here for a group reality check.


Written By: Chris Jenkins

Chris has been a political gadfly ever since he became a pundit for one of the earliest culture and politics webzines, Spark-Online.com, back in 1999. A registered independent moderate, his leftist friends think he's too right, and his rightist friends think he's too left. Either way, he's correct about everything.