Filter bubble

Consequently, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles, resulting in a limited and customized view of the world.[2]

"[8] An internet user's past browsing and search history is built up over time when they indicate interest in topics by "clicking links, viewing friends, putting movies in [their] queue, reading news stories," and so forth.

"[20] Pariser also reports: According to one Wall Street Journal study, the top fifty Internet sites, from CNN to Yahoo to MSN, install an average of 64 data-laden cookies and personal tracking beacons.

In a test seeking to demonstrate the filter bubble effect, Pariser asked several friends to search for the word "Egypt" on Google and send him the results.

According to Pariser, the detrimental effects of filter bubbles include harm to society at large, in that they have the possibility of "undermining civic discourse" and making people more vulnerable to "propaganda and manipulation."[26][28]

"[28] A filter bubble has been described as exacerbating a phenomenon that called splinternet or cyberbalkanization,[Note 1] which happens when the internet becomes divided into sub-groups of like-minded people who become insulated within their own online community and fail to get exposure to different views.

"[33] Both "echo chambers" and "filter bubbles" describe situations where individuals are exposed to a narrow range of opinions and perspectives that reinforce their existing beliefs and biases, but there are some subtle differences between the two, especially in practices surrounding social media.

Specific to news media, an echo chamber is a metaphorical description of a situation in which beliefs are amplified or reinforced by communication and repetition inside a closed system.[34][35]

Organizations such as the Washington Post, The New York Times, and others have experimented with creating new personalized information services, with the aim of tailoring search results to those that users are likely to like or agree with.[10][failed verification]

Subsequently, the study noted a lack of empirical data for the existence of filter bubbles across disciplines[12] and suggested that the effects attributed to them may stem more from preexisting ideological biases than from algorithms.[47]

Similar views can be found in other academic projects, which also address concerns with the definitions of filter bubbles and the relationships between ideological and technological factors associated with them.

"[49] A study by Oxford, Stanford, and Microsoft researchers examined the browsing histories of 1.2 million U.S. users of the Bing Toolbar add-on for Internet Explorer between March and May 2013.

A study by Levi Boxell, Matthew Gentzkow, and Jesse M. Shapiro suggests that online media is not the driving force behind political polarization.[55]

Although algorithms and filter bubbles weaken content diversity, this study finds that political polarization trends are driven primarily by pre-existing views and a failure to recognize outside sources.

Though the study found that only about 15–20% of the average user's Facebook friends subscribe to the opposite side of the political spectrum, Julia Kaman from Vox theorized that this could have potentially positive implications for viewpoint diversity.[60]

Similarly, a study of Twitter's filter bubbles by New York University concluded that "Individuals now have access to a wider span of viewpoints about news events, and most of this information is not coming through the traditional channels, but either directly from political actors or through their friends and relatives."[63]

"[63] According to these studies, social media may be diversifying information and opinions users come into contact with, though there is much speculation around filter bubbles and their ability to create deeper political polarization.

Pariser argues that filter bubbles reinforce a sense of social homogeneity, which weakens ties between people with potentially diverging interests and viewpoints.

Users can take many actions to burst through their filter bubbles, for example by making a conscious effort to evaluate what information they are exposing themselves to, and by thinking critically about whether they are engaging with a broad range of content.

Some browser plug-ins aim to help people step out of their filter bubbles and make them aware of their personal perspectives; these tools show users content that contradicts their existing beliefs and opinions.[75]
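As a rough illustration of the general idea behind such plug-ins (not a description of any actual extension), the Python sketch below surfaces items whose perspective label is underrepresented in a user's reading history; the "perspective" field and all data here are assumptions made for the example.

    from collections import Counter

    def suggest_outside_perspectives(read_history, candidates, max_items=3):
        # Count how often each (hypothetical) perspective label appears in what
        # the user has already read, then rank candidates so the least-seen
        # perspectives come first.
        seen = Counter(article["perspective"] for article in read_history)
        ranked = sorted(candidates, key=lambda a: seen.get(a["perspective"], 0))
        return ranked[:max_items]

    # Invented reading history and candidate pool for illustration.
    history = [{"title": "Op-ed A", "perspective": "left"},
               {"title": "Op-ed B", "perspective": "left"}]
    pool = [{"title": "Column C", "perspective": "right"},
            {"title": "Column D", "perspective": "center"},
            {"title": "Column E", "perspective": "left"}]

    for article in suggest_outside_perspectives(history, pool):
        print(article["title"], "-", article["perspective"])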

Questions involving bias or controversial opinions will not be addressed until a later time, which points to a larger problem that still exists: whether the search engine acts as an arbiter of truth or as a knowledgeable guide by which to make decisions.

The Mozilla Information Trust Initiative (MITI) would serve as a collective effort to develop products, research, and community-based solutions to combat the effects of filter bubbles and the proliferation of fake news.

Mozilla's Open Innovation team leads the initiative, striving to combat misinformation, with a specific focus on product literacy, research, and creative interventions.

Scholars have begun considering the effect of filter bubbles on the users of social media from an ethical standpoint, particularly concerning personal freedom, security, and information bias.[86]

Mark Zuckerberg, founder of Facebook, and Eli Pariser, author of The Filter Bubble, have expressed concerns regarding the risks of privacy and information polarization.[89]

In light of the 2016 U.S. presidential election, scholars have likewise expressed concerns about the effect of filter bubbles on democracy and democratic processes, as well as the rise of "ideological media".[95]

These scholars fear that users will be unable to "[think] beyond [their] narrow self-interest" as filter bubbles create personalized social feeds, isolating them from diverse points of view and their surrounding communities.[11]

A related concern is how filter bubbles contribute to the proliferation of "fake news" and how this may influence political leaning, including how users vote.[97][98][99]

Revelations in March 2018 of Cambridge Analytica's harvesting and use of user data for at least 87 million Facebook profiles during the 2016 presidential election highlight the ethical implications of filter bubbles.[11][100][101]

Christopher Wylie, co-founder of and whistleblower on Cambridge Analytica, detailed how the firm had the ability to develop "psychographic" profiles of those users and use the information to shape their voting behavior.[102]

Image caption: Social media inadvertently isolates users into their own ideological filter bubbles, according to internet activist Eli Pariser.
Image caption: The term filter bubble was coined by internet activist Eli Pariser, circa 2010.
Image caption: Visualization of the process and growth of two social media bots used in the 2019 Weibo study. The diagrams represent two aspects of the structure of filter bubbles, according to the study: large concentrations of users around single topics and a uni-directional, star-like structure that impacts key information flows.