A Conceptual Tool to Eliminate Filter Bubbles in Social Networks

Authors

  • Alireza Amrollahi, Macquarie Business School

DOI:

https://doi.org/10.3127/ajis.v25i0.2867

Keywords:

filter bubble, social networks, prescriptive study, information bubble

Abstract

Reliance on social media as a source of information has led to several challenges, including the narrowing of information sources to match viewers’ preferences and desires, a phenomenon known as the filter bubble. The formation of filter bubbles is a recognised risk to democracy and can bring negative consequences such as the polarisation of society, users’ drift towards extremist viewpoints, and the proliferation of fake news. Previous studies have focused on specific aspects of the problem and paid less attention to a holistic approach for eliminating filter bubbles. The current study, however, aims to propose a model for an integrated tool that assists users in avoiding filter bubbles in social networks. To this end, a systematic literature review was undertaken in which 571 papers were initially identified across six top-ranked scientific databases. After excluding irrelevant studies and performing an in-depth analysis of the remaining papers, a classification of research studies is proposed. This classification is then used to introduce an overall architecture for an integrated tool that synthesises all previous studies and offers new features for avoiding filter bubbles. The study explains the components and features of the proposed architecture and concludes with a list of implications for the recommended tool.

References

Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives 31(2), 211–236.

Amrollahi, A., Ghapanchi, A. H., & Talaei-Khoei, A. (2014). Three decades of research on strategic information system plan development. Communications of the Association for Information Systems 34(1), 84.

Amrollahi, A., & McBride, N. (2019). How to burst the bubble in social networks? In 24th UK Academy for Information Systems international conference. Oxford, UK.

TK, A., George, K., & Thomas, J. P. (2015). An empirical approach to detection of topic bubbles in tweets. 2015 IEEE/ACM 2nd international symposium on big data computing (BDC) (pp. 31–40). New York: IEEE.

Awan, I. (2017). Cyber-extremism: Isis and the power of social media. Society 54(2), 138–149.

Bail, C. A., Argyle, L. P., Brown, T. W., Bumpus, J. P., Chen, H., Hunzaker, M. F., … Volfovsky, A. (2018). Exposure to opposing views on social media can increase political polarisation. Proceedings of the National Academy of Sciences 115(37), 9216–9221.

Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science 348(6239), 1130–1132.

Bhatt, S., Joglekar, S., Bano, S., & Sastry, N. (2018). Illuminating an ecosystem of partisan websites. arXiv preprint arXiv:1803.03576.

Bozdag, E., & Timmermans, J. (2011). Values in the filter bubble ethics of personalisation algorithms in cloud computing. 1st international workshop on values in design–Building bridges between RE, HCI and ethics. Lisbon, Portugal.

Bozdag, E., Gao, Q., Houben, G.-J., & Warnier, M. (2014). Does offline political segregation affect the filter bubble? An empirical analysis of information diversity for Dutch and Turkish Twitter users. Computers in Human Behavior 41, 405–415.

Bozdag, E., & van den Hoven, J. (2015). Breaking the filter bubble: Democracy and design. Ethics and Information Technology 17(4), 249–265.

Bozdag, V. E. (2015). Bursting the filter bubble: Democracy, design, and ethics (Doctoral thesis). Delft University of Technology, Delft, Netherlands.

Cadwalladr, C., & Graham-Harrison, E. (2018). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian 17, 22–23.

Corner, A., Whitmarsh, L., & Xenias, D. (2012). Uncertainty, scepticism and attitudes towards climate change: Biased assimilation and attitude polarisation. Climatic Change 114(3–4), 463–478.

Costello, M., Hawdon, J., Ratliff, T., & Grantham, T. (2016). Who views online extremism? Individual attributes leading to exposure. Computers in Human Behavior 63, 311–320.

Courtois, C., Slechten, L., & Coenen, L. (2018). Challenging Google search filter bubbles in social and political information: Disconfirming evidence from a digital methods case study. Telematics and Informatics 35(7), 2006–2015.

Cowling, D. (2019). Social media statistics Australia—February 2019. Retrieved from https://www.socialmedianews.com.au/social-media-statistics-australia-february-2019/

Divyaa, L. R., Tamhane, A., & Pervin, N. (2018). A clustering based social matrix factorisation technique for personalised recommender systems. Paper presented at the 24th Americas conference on information systems, New Orleans, LA.

Dylko, I., Dolgov, I., Hoffman, W., Eckhart, N., Molina, M., & Aaziz, O. (2018). Impact of customizability technology on political polarisation. Journal of Information Technology & Politics 15(1), 19–33.

Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter bubbles, echo chambers, and online news consumption. Public Opinion Quarterly 80(S1), 298–320.

Foth, M., Tomitsch, M., Forlano, L., Haeusler, M. H., & Satchell, C. (2016). Citizens breaking out of filter bubbles: Urban screens as civic media. Proceedings of the 5th ACM international symposium on pervasive displays (pp. 140–147). New York: Association for Computing Machinery.

Garrett, R. K. (2017). The ‘echo chamber’ distraction: Disinformation campaigns are the problem, not audience fragmentation. Journal of Applied Research in Memory and Cognition 6(4), 370–376.

Gregor, S., & Jones, D. (2007). The anatomy of a design theory. Journal of the Association for Information Systems 8(5), 312–335.

Haim, M., Graefe, A., & Brosius, H.-B. (2018). Burst of the filter bubble? Effects of personalisation on the diversity of Google News. Digital Journalism 6(3), 330–343.

Hannak, A., Sapiezynski, P., Molavi Kakhki, A., Krishnamurthy, B., Lazer, D., Mislove, A., & Wilson, C. (2013). Measuring personalisation of web search. Proceedings of the 22nd international conference on world wide web (pp. 527–538). New York: Association for Computing Machinery.

Helberger, N., Kleinen-von Königslöw, K., & van der Noll, R. (2015). Regulating the new information intermediaries as gatekeepers of information diversity. Info 17(6), 50–71.

Hutchison, P. D., Daigle, R. J., & George, B. (2018). Application of latent semantic analysis in AIS academic research. International Journal of Accounting Information Systems 31, 83–96.

Jamieson, K. H., & Cappella, J. N. (2008). Echo chamber: Rush Limbaugh and the conservative media establishment. Oxford: Oxford University Press.

Kitchenham, B. A., & Charters, S. (2007). Guidelines for performing systematic literature reviews in software engineering (EBSE Technical Report). Durham, UK: University of Durham.

Lahoti, P., Garimella, K., & Gionis, A. (2018). Joint non-negative matrix factorisation for learning ideological leaning on Twitter. Proceedings of the 11th ACM international conference on web search and data mining (pp. 351–359). New York: Association for Computing Machinery.

Lazer, D. M., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., Metzger, M. J., Nyhan, B., Pennycook, G., & Rothschild, D. (2018). The science of fake news. Science 359(6380), 1094–1096.

Liao, Q. V., & Fu, W.-T. (2013). Beyond the filter bubble: Interactive effects of perceived threat and topic involvement on selective exposure to information. Proceedings of the SIGCHI conference on human factors in computing systems (pp. 2359–2368). New York: Association for Computing Machinery.

Linder, R., Stacy, A. M., Lupfer, N., Kerne, A., & Ragan, E. D. (2018). Pop the feed filter bubble: Making Reddit social media a VR cityscape. 2018 IEEE conference on virtual reality and 3D user interfaces (VR) (pp. 619–620). Piscataway, NJ: Institute of Electrical and Electronics Engineers.

Matakos, A., Terzi, E., & Tsaparas, P. (2017). Measuring and moderating opinion polarisation in social networks. Data Mining and Knowledge Discovery 31(5), 1480–1505.

Matt, C., Benlian, A., Hess, T., & Weiß, C. (2014). Escaping from the filter bubble? The effects of novelty and serendipity on users’ evaluations of online recommendations. Paper presented at the 35th international conference on information systems, Auckland, NZ.

Möller, J., Trilling, D., Helberger, N., & van Es, B. (2018). Do not blame it on the algorithm: An empirical assessment of multiple recommender systems and their impact on content diversity. Information, Communication & Society 21(7), 959–977.

Mullainathan, S., & Washington, E. (2009). Sticking with your vote: Cognitive dissonance and political attitudes. American Economic Journal: Applied Economics 1(1), 86–111.

Müller, O., Schmiedel, T., Gorbacheva, E., & Vom Brocke, J. (2016). Towards a typology of business process management professionals: Identifying patterns of competences through latent semantic analysis. Enterprise Information Systems 10(1), 50–80.

Nagulendra, S., & Vassileva, J. (2014). Understanding and controlling the filter bubble through interactive visualisation: A user study. Proceedings of the 25th ACM conference on hypertext and social media (pp. 107–115). New York: Association for Computing Machinery.

Nagulendra, S., & Vassileva, J. (2016). Providing awareness, explanation and control of personalised filtering in a social networking site. Information Systems Frontiers 18(1), 145–158.

Nguyen, T. T., Hui, P.-M., Harper, F. M., Terveen, L., & Konstan, J. A. (2014). Exploring the filter bubble: The effect of using recommender systems on content diversity. Proceedings of the 23rd international conference on world wide web (pp. 677–686). New York: Association for Computing Machinery.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology 2(2), 175–220.

O’Callaghan, D., Greene, D., Conway, M., Carthy, J., & Cunningham, P. (2013). The extreme right filter bubble. arXiv preprint arXiv:1308.6149.

Paré, G., Tate, M., Johnstone, D., & Kitsiou, S. (2016). Contextualising the twin concepts of systematicity and transparency in information systems literature reviews. European Journal of Information Systems 25(6), 493–508.

Pariser, E. (2011). The filter bubble: What the internet is hiding from you. London: Penguin.

Pariser, E. (2011). Beware online “filter bubbles” [Video]. TED Talks. Retrieved from https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles

Postill, J. (2018). Populism and social media: A global perspective. Media, Culture & Society 40(5), 754–765.

Purtill, J. (2019). Fuelled by a toxic, alt-right echo chamber, Christchurch shooter’s views were celebrated online. ABC News. Retrieved from https://www.abc.net.au/triplej/programs/hack/christchurch-shooters-views-were-celebrated-online/10907056

Quraishi, M., Fafalios, P., & Herder, E. (2018). Viewpoint discovery and understanding in social networks. Proceedings of the 10th ACM conference on web science (pp. 47–56). New York: Association for Computing Machinery.

Qureshi, I., Bhatt, B., Gupta, S., & Tiwari, A. A. (2020). Causes, symptoms and consequences of social media induced polarisation (SMIP). Information Systems Journal. Manuscript in preparation.

Rehm, G. (2017). An infrastructure for empowering internet users to handle fake news and other online media phenomena. International conference of the German Society for Computational Linguistics and Language Technology (pp. 216–231). New York: Springer.

Resnick, P., Garrett, R. K., Kriplean, T., Munson, S. A., & Stroud, N. J. (2013). Bursting your (filter) bubble: Strategies for promoting diverse exposure. Proceedings of the 2013 conference on computer supported cooperative work companion (pp. 95–100). https://doi.org/10.1145/2441955.2441981

Ridgway, R. (2017). Against a personalisation of the self. Ephemera: Theory & Politics in Organization 17(2), 377–397.

Sanz-Cruzado, J., & Castells, P. (2018). Enhancing structural diversity in social networks by recommending weak ties. Proceedings of the 12th ACM conference on recommender systems (pp. 233–241). New York: Association for Computing Machinery.

Seargeant, P., & Tagg, C. (2018). Social media and the future of open debate: A user-oriented approach to Facebook’s filter bubble conundrum. Discourse, Context & Media 27, 41–48.

Shah, D., Koneru, P., Shah, P., & Parimi, R. (2016). News recommendations at scale at Bloomberg Media: Challenges and approaches. Proceedings of the 10th ACM conference on recommender systems (pp. 369–369). New York: Association for Computing Machinery.

Shearer, E. (2018, 10 December). Social media outpaces print newspapers in the U.S. as a news source. Pew Research Center. Retrieved from https://www.pewresearch.org/fact-tank/2018/12/10/social-media-outpaces-print-newspapers-in-the-u-s-as-a-news-source/

Spohr, D. (2017). Fake news and ideological polarisation: Filter bubbles and selective exposure on social media. Business Information Review 34(3), 150–160.

Sunstein, C. (2007). Republic.com 2.0. Princeton, NJ: Princeton University Press.

Taramigkou, M., Bothos, E., Christidis, K., Apostolou, D., & Mentzas, G. (2013). Escape the bubble: Guided exploration of music preferences for serendipity and novelty. Proceedings of the 7th ACM conference on recommender systems (pp. 335–338). New York: Association for Computing Machinery.

The Australia Institute. (2019). ABC still Australia’s most trusted news source. Retrieved from http://www.tai.org.au/content/abc-still-australia-s-most-trusted-news-source

Thonet, T., Cabanac, G., Boughanem, M., & Pinel-Sauvagnat, K. (2017). Users are known by the company they keep: Topic models for viewpoint discovery in social networks. Proceedings of the 2017 ACM conference on information and knowledge management (pp. 87–96). New York: Association for Computing Machinery.

Valdez, A. C., Kluge, J., & Ziefle, M. (2018). Elitism, trust, opinion leadership and politics in social protests in Germany. Energy Research & Social Science, 43, 132–143.

Van den Bulck, H., & Moe, H. (2018). Public service media, universality and personalisation through algorithms: Mapping strategies and exploring dilemmas. Media, Culture & Society 40(6), 875–892.

Van Dijck, J., & Poell, T. (2013). Understanding social media logic. Media and Communication 1(1), 2–14.

Webberley, W. M., Allen, S. M., & Whitaker, R. M. (2016). Retweeting beyond expectation: Inferring interestingness in Twitter. Computer Communications 73, 229–235.

Weerasinghe, K., Pauleen, D., Scahill, S., & Taskin, N. (2018). Development of a theoretical framework to investigate alignment of big data in healthcare through a social representation lens. Australasian Journal of Information Systems 22. https://doi.org/10.3127/ajis.v22i0.1617

Winter, C. (2016). An integrated approach to Islamic State recruitment. Canberra: Australian Strategic Policy Institute.

Wood, G., Long, K., Feltwell, T., Rowland, S., Brooker, P., Mahoney, J., … Lawson, S. (2018). Rethinking engagement with online news through social and visual co-annotation. Proceedings of the 2018 CHI conference on human factors in computing systems (p. 576). New York: Association for Computing Machinery.

Woon, J. (2018). Primaries and candidate polarisation: Behavioral theory and experimental evidence. American Political Science Review 112(4), 826–843.

Yang, M., Wen, X., Lin, Y.-R., & Deng, L. (2017). Quantifying content polarisation on Twitter. 2017 IEEE 3rd international conference on collaboration and internet computing (CIC) (pp. 299–308). New York: IEEE.

Published

2021-04-06

How to Cite

Amrollahi, A. (2021). A Conceptual Tool to Eliminate Filter Bubbles in Social Networks. Australasian Journal of Information Systems, 25. https://doi.org/10.3127/ajis.v25i0.2867

Issue

25 (2021)

Section

Selected Papers from the Australasian Conference on Information Systems (ACIS)