Social Network Relationships and Structures
Published in Michael Muhlmeyer, Shaurya Agarwal, Information Spread in a Social Media Age, 2021
Michael Muhlmeyer, Shaurya Agarwal
In a modern social media context, this phenomenon is often known as a filter “bubble” [34]. Similar to a social “echo chamber”, a filter bubble is the isolation of an individual’s thoughts, perceptions, and news consumption from opposing viewpoints, driven by that individual’s current belief system, social media circles, and internet search habits. Evolving, non-transparent technology has intensified filter bubbles as personalized news streams, ads, and search results come to dominate typical internet activity. As these concepts have entered public consciousness [35], particularly after the 2016 United States presidential election results, which many attributed in part to social media, growing concern has arisen over whether this trend harms democratic ideals. Additionally, awareness of fake news and filter bubbles has eroded some public trust in traditional television, newspaper, and internet journalism. The graph in Figure 4.5 illustrates a growing, widespread distrust of mass media in the United States over the years.
At the Crossroad
Published in Anastasia Powell, Gregory Stratton, Robin Cameron, Digital Criminology, 2018
Anastasia Powell, Gregory Stratton, Robin Cameron
Social commentators and scholars alike have identified the self-affirming nature of individuals’ online engagement. For instance, US legal scholar Cass Sunstein (2004, 2009, 2017) has noted that one of the more striking features of digital society is the capacity for users to actively curate and ‘filter’ what they see. Such filtering he describes as a ‘mixed blessing’: it simultaneously allows individuals to filter out ‘noise’ in an increasingly information-heavy digital life and restricts engagement with a diversity of topics and points of view. ‘The implication,’ Sunstein suggests, ‘is that groups of people, especially if they are like-minded, will end up thinking the same thing they thought before—but in more extreme form, and sometimes in a much more extreme form’ (Sunstein, 2004, p. 58). In this way, a ‘filter bubble’ may contribute to the further polarisation of public sentiment and political ideology, in ways that are not only damaging for democratic engagement but that amplify cultures of hate (discussed further in Chapter 6). Indeed, several studies have found evidence that individuals tend to actively choose and ‘follow’ news outlets that are aligned with their own political opinions (Garrett, 2009; Iyengar & Hahn, 2009; Munson & Resnick, 2010).
Normative or Effective? The Role of News Diversity and Trust in News Recommendation Services
Published in International Journal of Human–Computer Interaction, 2023
Although using algorithms to recommend news may help overcome the issue of information overload, it may also decrease news diversity (Fletcher & Nielsen, 2018) by limiting the range of information that users are exposed to, leading to problems such as the filter bubble phenomenon (Pariser, 2011). A filter bubble refers to the situation in which algorithms withhold news or information that is unpopular or that does not match users’ preferences, interests, or beliefs, so that users are exposed only to a narrow scope of news or information (Fletcher & Nielsen, 2018; Pariser, 2011). A filter bubble thus denotes a state of extremely low diversity in the news articles recommended by the algorithms. News diversity is very important, as news can significantly shape the political orientation and public opinion of news users (Beam & Kosicki, 2014; Fletcher & Nielsen, 2018; Hannak et al., 2013). If news diversity is low and users are provided with only a limited range of information, they may form narrow-minded opinions (Fletcher & Nielsen, 2018; Haim et al., 2018).
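The low-diversity state described above can be made concrete with a toy sketch. The following is a minimal illustration, not any cited system: the topic names, the “serve only the most-clicked topics” rule, and the use of Shannon entropy as a diversity measure are all assumptions chosen for clarity.

```python
import math
from collections import Counter

def topic_entropy(items):
    """Shannon entropy (in bits) of the topic mix in a list of news items;
    higher entropy means a more diverse feed."""
    counts = Counter(item["topic"] for item in items)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def filtered_feed(catalog, click_history, k=1):
    """Serve only items from the user's k most-clicked topics --
    a deliberate caricature of preference-based filtering."""
    top = {t for t, _ in Counter(click_history).most_common(k)}
    return [item for item in catalog if item["topic"] in top]

# A hypothetical catalog with three topics, evenly represented.
catalog = [{"id": i, "topic": t}
           for i, t in enumerate(["politics", "sports", "culture",
                                  "politics", "sports", "culture"])]
history = ["politics", "politics", "sports"]  # user clicked politics most

full_diversity = topic_entropy(catalog)                    # log2(3), maximal
bubble_diversity = topic_entropy(filtered_feed(catalog, history))  # zero
```

Under this caricatured rule, the served feed collapses to a single topic and its entropy drops to zero — a toy analogue of the “extremely low diversity” state the excerpt describes; real recommenders blend many signals, so the narrowing is gradual rather than absolute.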
Towards Personalized Movie Selection for Wellness: Investigating Event-Inspired Movies
Published in International Journal of Human–Computer Interaction, 2020
Sharon Lynn Chu, Sarah Brown, Hannah Park, Brittnie Spornhauer
This view of personalized movie recommendations is related to a problem that some have called the ‘filter bubble’. The filter bubble refers to the idea that “recommenders isolate us from a diversity of viewpoints, content, and experiences, and thus make us less likely to discover and learn new things” (Pariser, 2011). The complacency that current approaches to recommendation may instill could then lead people to choose more instant-gratification movies (e.g., action movies) over substantive educational experiences (Knijnenburg et al., 2016).
Designing an Algorithm-Driven Text Generation System for Personalized and Interactive News Reading
Published in International Journal of Human–Computer Interaction, 2019
The filter bubble raises two problems: the process of information screening is not transparent, and it removes freedom of choice in the selective consumption of information. It also limits interaction between people with diverse perspectives and opinions, and with it the opportunity to solve problems through deliberative processes (Bozdag & van den Hoven, 2015).