Independent Lens

Do You Live in a News Bubble?

By Brooke Shelby Biggs


Whether you think of her as a visionary or a kook, Marion Stokes was far ahead of her time in addressing a problem in American media that continues to this day: a lack of diversity among the gatekeepers of traditional media. When she confronted white male panelists on the Philadelphia local public affairs program Input about the ways in which they systematically kept the voices of women and people of color out of the conversation and away from the levers of control, she might well have been addressing the media of 2020. 

Indeed, in June, the executive editor of the Philadelphia Inquirer stepped down when the lack of diversity in his newsroom resulted in an offensive headline making print during the George Floyd/Black Lives Matter protests. In Pittsburgh, a black reporter was pulled off the protest beat by his editors; the New York Times’ opinion page editor resigned under fire; Bon Appétit’s editor resigned after a racially flippant tweet. The problem of diversity among media gatekeepers is nothing new, and the attention on it today would undoubtedly please Stokes. 

Each wave of media realignment and technological advancement promises to alleviate that problem by putting the means of media production and distribution into more and more hands. Cable television’s revolutionary potential was evident to Stokes from the start. When I covered media for Wired in the early days of the internet, we young idealists were giddy about the “democratization of media” we believed we were witnessing.

Forty years after the birth of 24-hour cable news and 25 years since the dawn of the web, we might ask ourselves: Do more voices always mean greater diversity? Does a greater volume of information necessarily get us closer to the truth? The sheer volume of information available at our fingertips in the 21st century — both good and bad — is far too vast for us to manage efficiently on our own.

It would be impossible for us individually to find what we need, what is most useful, valid and relevant, by sifting through millions of terabytes of data ourselves each time. Thus, we have employed a new breed of modern media gatekeepers. And they are as problematic as the ones Stokes put on the spot back in the 1970s. Or perhaps more so, because they aren’t human. They’re algorithms.

An algorithm is simply a set of instructions that uses selected data about a user to determine what a computer or website does or shows. Any time you see something as you browse such as “You may also like” or “Recommended for you,” what you’re seeing are the results of an algorithm that uses personal data about you — what you tend to like, read, buy, watch, or even where you’ve been and who you were with — and attempts to serve up more of the same. Any time you search on Google, browse Netflix, watch YouTube videos, scroll through your Facebook feed, or shop online, algorithms are working silently in the background to gather more data about you so they can give you more of what you want. As a result, if two people search Google for the same thing, they will not necessarily get the same results.
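The basic mechanism is simple enough to sketch in a few lines. Here is a minimal, hypothetical content-based recommender — the item titles and tags are invented for illustration, and real systems use far richer signals — that ranks unseen items by how many tags they share with a user’s history:

```python
from collections import Counter

def recommend(history, catalog, k=2):
    """Rank unseen catalog items by tag overlap with the user's history.
    A toy sketch of "Recommended for you"; real recommenders are far richer."""
    # Count how often each tag appears in the items the user has already read.
    profile = Counter(tag for item in history for tag in item["tags"])
    seen = {item["title"] for item in history}

    def score(item):
        # An item's score is the sum of the user's affinity for its tags.
        return sum(profile[tag] for tag in item["tags"])

    unseen = [item for item in catalog if item["title"] not in seen]
    return [item["title"] for item in sorted(unseen, key=score, reverse=True)[:k]]

# Two users, one catalog (all titles and tags are hypothetical).
catalog = [
    {"title": "Tax bill analysis", "tags": ["politics", "economy"]},
    {"title": "Playoff recap",     "tags": ["sports"]},
    {"title": "Election preview",  "tags": ["politics", "us"]},
]
reader_a = [{"title": "Senate hearing", "tags": ["politics", "us"]}]
reader_b = [{"title": "Draft night",    "tags": ["sports"]}]
```

Feed the same catalog two different histories and the rankings diverge — which, at bottom, is all “personalization” means.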

Because of the nature of the technology industry, especially in Silicon Valley, most of these algorithms are written by white people, and mostly by men. Algorithms can be biased; they can be sexist, and they can be racist. An algorithm is only as good as the data it is fed; feed it only one kind of data and it will spit out only one kind of result (a concept known in technology as “GIGO”: garbage in, garbage out).
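GIGO is easy to demonstrate with a deliberately crude model. In this sketch — the labels and history are entirely hypothetical, and no real system is this simple — a “model” is trained by memorizing the majority label in its training data, so a skewed history produces a skewed predictor no matter what it is later asked about:

```python
from collections import Counter

def train_majority(past_decisions):
    """A deliberately crude "model": always predict the most common
    label seen in training. Garbage in, garbage out."""
    return Counter(past_decisions).most_common(1)[0][0]

# Hypothetical history in which 90% of past decisions went one way.
skewed_history = ["approve_group_a"] * 90 + ["approve_group_b"] * 10
model_output = train_majority(skewed_history)  # faithfully reproduces the skew
```

The model isn’t malicious; it simply echoes its inputs — which is exactly the point about one-sided data.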

While algorithms help us manage the firehose of information coming at us online, among their unintended consequences is that they artificially limit our access to information and ideas outside of the comfortable and familiar. They appeal to and reinforce our cognitive biases and help us avoid the discomfort of cognitive dissonance. They can even make us believe we know more than we really do.

The result of an algorithm-mediated information diet is twofold. First, because we all experience the world through media uniquely molded by our own predilections, we no longer have a common experience or understanding of the world. “Where were you when you heard President Kennedy was shot?” has been replaced by “Did you see that viral video?” or “Vaccines are a conspiracy; look it up.”

There’s a popular nostalgic meme about Walter Cronkite that characterizes him as someone who told the truth and never had an opinion and was reliable and faultless. But of course, when Cronkite was working, there were only three networks, all run by white men, and the news was whatever they decided it was. That so much power and control was afforded these few and similar arbiters of information was what infuriated Marion Stokes about traditional media. But it would probably frustrate her about digital media today, in new yet familiar ways. We have just replaced one variety of blindered, inadequate gatekeepers with fancy, new, but equally flawed and insufficient ones.

The other problem with algorithms is that they trap us in what are known as “filter bubbles” — the digital equivalent of an echo chamber, in which our incoming sources of information have been so customized to us that they simply confirm and reinforce what we already believe. They insulate us from novel and challenging ideas and distort our perception of reality. And oftentimes they do this with our explicit participation, when we eagerly provide information that helps websites and platforms “customize” and “personalize” our experiences.
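The feedback loop behind a filter bubble can be simulated in miniature. In this hypothetical sketch — the topics and weights are invented, and real recommenders mix in many other signals — the system always shows the user’s current top topic and every click reinforces it, so a modest initial lean hardens into near-total dominance:

```python
def feedback_loop(prefs, rounds=50, boost=0.1):
    """Always recommend the highest-weighted topic; each click feeds
    back and boosts it. A toy model of filter-bubble reinforcement."""
    prefs = dict(prefs)
    for _ in range(rounds):
        shown = max(prefs, key=prefs.get)  # exploit the favorite, never explore
        prefs[shown] += boost              # the click reinforces the choice
    total = sum(prefs.values())
    return {topic: weight / total for topic, weight in prefs.items()}

# A modest 40/30/30 lean going in comes out as a lopsided bubble.
before = {"politics": 0.4, "sports": 0.3, "science": 0.3}
after = feedback_loop(before)
```

Real systems deliberately add exploration to soften exactly this dynamic, but the underlying pull toward the already-familiar is the same.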


Writings and thoughts on how to minimize the size and impact of your filter bubble


Brooke Shelby Biggs is a San Francisco-based writer and media literacy educator.
