Lentis/News Echo Chambers

An Echo Chamber is a metaphorical description of a situation in which beliefs and values are amplified or reinforced through repetition and validation from others inside a closed system. The term is based on an acoustic echo chamber, a hollow enclosure in which sounds reverberate. A News Echo Chamber is created when people choose to receive news only from sources they already trust. Trusted sources may be friends, family, particular websites, or particular television channels. Inside an echo chamber, people find only information that reinforces their existing views, an often unintentional example of confirmation bias.

Causes
Echo chambers arise from a few key factors: isolation from opposing viewpoints, a binary worldview, distrust of other views, and confirmation bias. “Echo chambers isolate their members, not by cutting off their lines of communication to the world, but by changing whom they trust.” Once users are isolated from opposing viewpoints, they may enter a feedback loop in which their opinions are repeatedly confirmed. As a result, they may prefer to seek out individuals with similar values and become less likely to trust the judgment of people whose values differ from their own.

One of the main causes of News Echo Chambers is the Internet’s facilitation of alternative news sites. Many of these sites tell their audiences what they want to hear, regardless of journalistic ethics, so their factual accuracy cannot be taken for granted. Although many of their claims can be debunked with a quick search, such sites encourage users to believe that any news outlet presenting an opposing viewpoint is fake or unreliable. Many outlets cry “fake news” at reputable sites while touting their own accuracy, creating confusion about which outlets are reliable. This leads users to trust only the outlets that already match what they believe.

Echo Chambers in Social Media
Echo chambers can exist anywhere, but social media sites are especially vulnerable. Many users get a majority of their news from shared posts on social media. When like-minded people surround one another, they are more likely to share articles agreeing with their viewpoints and less likely to challenge false rumors about opposing viewpoints. This can create a feedback loop in which users see only similar articles online. Seeing the same viewpoints repeated can "trick our brains into thinking that this is the reality." Because online news outlets and social media rely on content-filtering algorithms to decide which content users see, the combination of users’ tendency to group with like-minded people and these algorithms creates a system in which one can easily fall into an echo chamber.
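The interaction between like-minded grouping and engagement-driven filtering described above can be illustrated with a toy simulation. Everything here is hypothetical, not any platform's actual algorithm: a feed repeatedly shows the articles it predicts will engage the user most, and each article seen nudges the user's leaning a little further toward it.

```python
import random

random.seed(42)

def predicted_engagement(user_leaning, article_leaning):
    """Toy model: engagement is highest when an article is extreme in the
    direction the user already leans (both leanings lie in [-1, 1])."""
    return user_leaning * article_leaning

def run_feed(user_leaning, rounds=50, pool_size=100, shown_per_round=5):
    """Simulate an engagement-maximizing feed acting on one user."""
    for _ in range(rounds):
        pool = [random.uniform(-1, 1) for _ in range(pool_size)]
        # The filtering algorithm shows only the articles it predicts
        # will engage the user most.
        shown = sorted(pool,
                       key=lambda a: predicted_engagement(user_leaning, a),
                       reverse=True)[:shown_per_round]
        for article in shown:
            # Confirmation: each article seen pulls the user toward it.
            user_leaning += 0.1 * (article - user_leaning)
    return user_leaning

# A mild initial leaning is amplified toward the extreme.
final = run_feed(0.1)
print(round(final, 2))
```

Under these assumptions a user who starts only slightly off-center ends up far from it, because the filter and the user's drift reinforce each other each round.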

Reddit
Reddit, a popular social media forum, naturally separates content through its user-created communities, called subreddits. Users choose which subreddits to follow, and can post, upvote, and view content from each of them. Communities such as r/LateStageCapitalism and r/The_Donald create isolated environments through their subreddit rules that can lead to distrust of other viewpoints. Both subreddits allow only positive discussion of their respective ideologies, and ban users who mention any contrary beliefs. Reddit user Jelvinjs7 states, “Reddit is only an echo chamber if people choose a user experience that makes it an echo chamber.” Users on Reddit are largely anonymous to the community, "making it particularly easy to dismiss others’ views, creating a more hostile environment for debate." However, this is not always the case. Several communities, such as r/ChangeMyView, have been created solely to discuss opposing viewpoints. Users may seek out these communities to reevaluate their own views.

Facebook
Content on Facebook is often shared by a user’s friends. Many users add only like-minded friends, so the content they see rarely breaks out of their comfort zone. Users are also more likely to believe content that confirms their beliefs, even when that content is deliberately satirical. Because more time spent on Facebook generates more revenue, the site is designed to keep users engaged for as long as possible. “Everything on your Facebook feed is curated and presented to you by an algorithm seeking to maximise your engagement by only showing you things that it thinks you will like and respond to.” And because users are more likely to click links that support their viewpoints regardless of truth, unaware users may continue to see content tailored to their beliefs, resulting in a skewed worldview.

YouTube
YouTube’s algorithm constantly adapts to recommend the videos it predicts are most likely to keep users on the site. Because YouTube earns money from ad revenue, it wants to keep users on the site for as long as possible. As one YouTube user states, “all I get are videos that I’ve already watched, or channels that I already clicked 20 times...” Many users similarly find the suggestions repetitive. This can have the unintended consequence of creating echo chambers of content, especially around political or news channels. If the algorithm notices a user watching a video affiliated with a political party, it will recommend more and more videos from that party, because it predicts they are more likely to be clicked on.
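The repetitiveness users complain about can be sketched as a rich-get-richer loop. The following is a hypothetical model, not YouTube's actual system: every click multiplies a channel's score, recommendations are drawn in proportion to score, and early clicks quickly lock the feed onto a few channels.

```python
import random
from collections import Counter

random.seed(0)

def simulate(channels, rounds=200, boost=1.5):
    """Toy recommender: sample a channel proportionally to its score,
    then boost the clicked channel's score, reinforcing the loop."""
    scores = {c: 1.0 for c in channels}
    recommended = Counter()
    for _ in range(rounds):
        pick = random.choices(channels,
                              weights=[scores[c] for c in channels])[0]
        recommended[pick] += 1
        # Assume the user clicks whatever is recommended (the worst case
        # for diversity); the click multiplies that channel's future odds.
        scores[pick] *= boost
    return recommended

counts = simulate(["news_a", "news_b", "cooking", "music"])
print(counts.most_common())
```

Even though all four channels start with equal scores, the multiplicative boost means whichever channel gets a small early lead ends up dominating the recommendations.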

Real-World Implications of Echo Chambers
Anti-Vaccination groups are an example of an echo chamber that has gained tremendous popularity in recent years. The spread of disinformation about vaccines has caused a resurgence of measles, a disease that was declared eliminated in the United States in 2000. In 2019, the United States experienced its largest measles outbreak since 1992. In addition, the Anti-Vaccination movement is behind many state-by-state initiatives to allow parents to opt out of mandatory vaccination. These ideas are propagated inside communities such as closed Facebook groups and forums that quash any ideas that do not align with the group.

Another example of echo chambers having real-world consequences is climate-change denial, which has been found to be closely linked with political conservatism: 95% of liberal Democrats, 88% of moderate/conservative Democrats, and 68% of liberal/moderate Republicans believe in global warming, but only 40% of conservative Republicans do. Because belief in global warming is tied so closely to political identity, it readily becomes a topic around which echo chambers form on the Internet. Feldman, Myers, Hmielowski, and Leiserowitz found in 2014 that partisan media sources do influence individuals’ beliefs about global warming. “Specifically, the use of conservative media sources such as Fox News and Rush Limbaugh is associated with the belief that global warming is not happening and greater opposition to climate policies, whereas use of non-conservative media such as network TV news, CNN, MSNBC, and NPR is associated with the belief that climate change is happening and greater policy support.”

Conclusions
In the age of the Internet, information has become increasingly available. Online news outlets’ goal of attracting viewers naturally leads users toward polarizing content. With vast amounts of information available, users must actively choose to look beyond the boundaries of what they already view. Political clusters in the U.S. are becoming more polarized. Future research could investigate echo chambers’ role in determining political outcomes, or their role in key issues such as the anti-vaccination movement and climate-change denial.