Fake News in the Age of Facebook

The "fake news" problem isn't just about "alternative facts." The problem has more to do with the spin, the narrative, the context that inclines you to believe, for example, whether there was or was not collusion between Donald Trump's campaign and officials of the Russian government. And the problem is that 60 percent of Americans get their news through social media, mostly Facebook, which uses mysterious algorithms to customize each of our news feeds, selecting all and only what interests me, as computed from every time I press "Like" or forward an article to friends.

It's not just Facebook. Sit with a friend or, better yet, a friendly enemy — someone you know has political views contrary to your own — and, using your own devices, type the same entries into your respective Google search windows. Try "BP," standing for the oil company that used to be British Petroleum but tried rebranding itself as Beyond Petroleum. You and your friendly enemy — perhaps your crackpot brother-in-law? — are going to get different results from your searches because of what Google knows about each of you and what you've searched before. If you are an avid environmentalist and he's a rabid lefty, you'll get more results about the environmental effects of the Deepwater Horizon disaster while he'll get more about its nefarious corporate causes.

Trending Toward Personalization

Personalization has a long and illustrious history, best understood in contrast with its opposite, the mass market of the post-World War II boom. That marketplace was structured around mass production, which reduced costs through economies of scale. Customers formed a mass market that was only gradually differentiated, first by demographic characteristics — age, income and education — and later by psychographics such as likes, dislikes, values and personality traits.

From 1983 until 1986, I served as the director of research for SRI International's Values and Lifestyles Program. Our clients were mostly marketers trying to match the right message about the right product to the right customer through the right media. As early as the 1980s, it became clear that this matching game — never an issue for mass markets — would only become finer-grained as technologies and media evolved. Demographic and psychographic segments would be sliced and diced into subsegments until finally, with the advent of the internet, we arrived at markets of one.

As Farhad Manjoo describes the process in his book True Enough: Learning to Live in a Post-Fact Society:

"The mainstream is drying up. In some ways, we are returning to the freewheeling days before radio and television launched the very idea of mass media — the era of partisan newspapers and pamphleteers. But our niches, now, are more niche than ever before. We are entering what you might call the trillion-channel universe: over the last two decades, advances in technology… have helped turn each of us into producers, distributors and editors of our own media diet."

Personalization has its bright sides. It can make shopping easier: Rather than leaving you to wander aimlessly down the endless aisles of a vast department store, Amazon will guide you toward the book you might like next based on what it knows about your recent purchases. For the customer, personalization can mean that whatever messaging reaches her is not a blaring announcement to the mass market; the ads she receives will be about only the things she's interested in. They will be targeted.

And that's what Facebook can sell to its advertisers: its success at solving the matching game; its ability to put in front of you, next to your news feed, all and only those things it knows you and your friends are interested in.

Personalization reverses the polarity of the messaging in the marketplace. Instead of all push from producer to consumer, now it's pull from consumer to producer. Rather than passively listening to broadcasting from the networks, or narrowcasting from the cable channels, today's consumer is narrowcatching by pulling from the internet, via Google, whatever he or she wants to know.

Politics and the Filter Bubble

Thanks to personalization, marketers and customers can find each other more easily. But what is good for the marketplace and the consumer is not necessarily good for the polity and its citizens, as Eli Pariser makes clear in his book The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think.

"Ultimately, the filter bubble can affect your ability to choose how you want to live. To be the author of your life... you have to be aware of a diverse array of options and lifestyles. When you enter a filter bubble, you're letting the companies that construct it choose which options you're aware of. You may think you're the captain of your own destiny, but personalization can lead you down a road to a kind of informational determinism in which what you've clicked on in the past determines what you see next — a Web history you're doomed to repeat. You can get stuck in a static, ever-narrowing version of yourself — an endless you-loop."

Nicholas Negroponte, founding head of MIT's Media Lab and a columnist for Wired magazine, put it this way:

"Imagine a future in which your interface agent can read every newswire and newspaper and catch every TV and radio broadcast on the planet, and then construct a personalized summary. This kind of newspaper is printed in an edition of one... Call it the Daily Me."

From another authoritative source, Pariser quotes then-Google CEO Eric Schmidt:

"Most people will have personalized newsreading experiences on mobile-type devices that will largely replace their traditional reading of newspapers. And that kind of news consumption will be very personal, very targeted. It will remember what you know. It will suggest things that you might want to know. It will have advertising. Right? And it will be as convenient and fun as reading a traditional newspaper or magazine."

But what is convenient and fun for the reader is not always good for the citizen. Pariser himself says: "The filter bubble will often block out the things in our society that are important but complex or unpleasant. It renders them invisible. And it's not just the issues that disappear. Increasingly, it's the whole political process."

In a world that's becoming fragmented into friends of friends, "the news" becomes equally fragmented. In place of different views on the same world, people are living in different worlds. This ontological point is the main theme of Manjoo, who writes:

"While new technology eases connections between people, it also, paradoxically, facilitates a closeted view of the world, keeping us coiled tightly with those who share our ideas. In a world that lacks real gatekeepers and authority figures, and in which digital manipulation is so effortless, spin, conspiracy theories, myths, and outright lies may get the better of many of us."

Unintended Consequences

Now, this is not what the early inventors of and writers about the internet had in mind. Geniuses like Norbert Wiener, who helped invent cybernetics, and Douglas Engelbart, who invented the mouse, wanted to facilitate a more connected and friendly world. As Pariser describes what I'll call "The Dream" that united many of us in the San Francisco Bay Area during the heyday of Wired magazine, "Despite their libertarian orientation, the writings of Esther Dyson, John Perry Barlow, and Kevin Kelly... fairly ache with a longing to return to an egalitarian world."

In his just-published World Without Mind: The Existential Threat of Big Tech, Franklin Foer cites Kevin Kelly's description of the ultimate book that would follow from Google's digitization of all books: "The real magic will come in the second act as each word in each book is cross-linked, clustered, cited, extracted, indexed, analyzed, annotated, remixed, reassembled and woven deeper into the culture than ever before."

Foer, who was not so incidentally the editor of the New Republic, a left-wing publication by any measure, expounds on The Dream:

"There was a political corollary to this prelapsarian dream. Not only would volumes melt into one beautiful book, disagreements would fade too… As readers worked together to annotate and edit texts, they would find common ground. The path of the network takes our most contentious debates and leads them toward consensus. Facebook puts it this way: 'By enabling people from diverse backgrounds to easily connect and share their ideas, we can decrease world conflict in the short and long term.'"

But that's not exactly the way it's working out. As Fred Turner puts it in From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism:

... "to the degree that the libertarian rhetoric of self-reliance embraces a New Communalist vision of consciousness-centered, information-oriented elite, it can also permit a deep denial of the moral and material costs of the long-term shift toward network modes of production and ubiquitous computing."

And further:

"Even as they suggested that such a world would in fact represent a return to a more natural, more intimate state of being, writers such as Kevin Kelly, Esther Dyson, and John Perry Barlow deprived their many readers of a language with which to think about the complex ways in which embodiment shapes all of human life, about the natural and social infrastructures on which that life depends, and about the effects that digital technologies and the network mode of production might have on life and its essential infrastructures."

I think Turner is being a little rough on these writers; after 32 years of crossing paths and working together in the San Francisco Bay Area, I count each of them as a friend. But he has a point.

Fixing a Network Gone Wrong

Is there a remedy for this network gone wrong? According to Foer, Pariser, Manjoo, and Turner, the internet has evolved from a unifying force for social solidarity into a divisive bubble machine. Foer calls for a regulatory fix: a Data Protection Authority akin to Elizabeth Warren's Consumer Financial Protection Bureau. But reviewers give that idea little credence.

Foer's take on the current state of the news media veers from the apoplectic to the righteous:

"Google and Facebook … are, after all, organizing the entire output of humanity.

"Of course, this is not an innocent activity — even though the tech companies disavow any responsibility for the material they publish and promote. They plead that they are mere platforms, neutral utilities for everyone's use and everyone's benefit. When Facebook was assailed for abetting the onslaught of false news stories during the 2016 presidential campaign … Mark Zuckerberg initially disclaimed any culpability. 'Our goal is to give every person a voice,' he posted on Facebook, washing his hands of the matter. It's galling to watch Zuckerberg walk away from the catastrophic collapse of the news business and the degradation of American civic culture, because his site has played such a seminal role in both. Though Zuckerberg denies it, the process of guiding the public to information is a source of tremendous cultural and political power. In the olden days, we described that power as gatekeeping — and it was a sacred obligation."

While often as critical as Foer when lamenting the way social media has evolved, Pariser is ultimately more optimistic about the future. He quotes the inventor of the World Wide Web:

"'We create the Web,' Sir Tim Berners-Lee wrote. 'We choose what properties we want it to have and not have. It is by no means finished (and it's certainly not dead).' It's still possible to build information systems that introduce us to new ideas, that push us in new ways. It's still possible to create media that show us what we don't know, rather than reflecting what we do. It's still possible to erect systems that don't trap us in an endless loop of self-flattery about our own interests or shield us from fields of inquiry that aren't our own."

In the meantime, as an antidote to echo chambers like Fox News, MSNBC or Facebook's news feed, the four books cited in this column make the case for something like Stratfor.

Fake News in the Age of Facebook is republished with permission from Stratfor.com.
