Submitted: 18 May 2025
Posted: 19 May 2025
Abstract
Keywords:
1. Introduction and Background of the Research
1.1. Digital Bangladesh and the Threat of Algorithmic Extremism
1.2. Fake News, Rumor, and the Jihadist Media Ecology
1.3. Propaganda and the Role of Social Media Algorithms
1.4. Political Climate and Institutional Vulnerabilities
1.5. Research Aims and Structure
The study pursues five aims:
- Analyze the types, origins, and impact of rumors, fake news, and disinformation related to jihadist narratives in Bangladesh during July–August 2024.
- Investigate how social media algorithms contributed to the amplification and normalization of violent extremist content.
- Examine specific case studies involving online radicalization and content virality.
- Evaluate the role of the state, civil society, and tech platforms in responding to digital extremism.
- Offer policy and governance recommendations to address algorithmic threats to national security and digital citizenship.
It is guided by four research questions:
- How were jihadist ideologies algorithmically diffused among Bangladeshi youth in mid-2024?
- What role did fake news, misinformation, and propaganda play in amplifying militant narratives?
- In what ways did social media platforms fail (or succeed) in mitigating the diffusion of extremist content?
- How did the state respond with counter-narratives, and how effective were those responses?
2. Objectives of the Study
2.1. General Objective
2.2. Specific Objectives
2.2.1. To Identify and Categorize the Typologies of Rumors, Fake News, and Disinformation
- Rumors (spontaneous, often unverifiable claims shared in emotional or chaotic moments),
- Fake news (fabricated or manipulated stories presented as legitimate journalism), and
- Disinformation (deliberate, strategic falsehoods meant to mislead, provoke, or incite).
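For analysis, these three categories can be operationalized as a simple coding schema. The sketch below is illustrative only — the fields and decision rules are assumptions for demonstration, not the study's actual coding instrument:

```python
from dataclasses import dataclass
from enum import Enum

class InfoType(Enum):
    RUMOR = "rumor"                    # spontaneous, often unverifiable claims
    FAKE_NEWS = "fake_news"            # fabricated stories posing as journalism
    DISINFORMATION = "disinformation"  # deliberate, strategic falsehoods

@dataclass
class ContentItem:
    text: str
    verified_false: bool     # has the claim been debunked?
    mimics_journalism: bool  # does it imitate a news outlet's format?
    coordinated: bool        # evidence of deliberate, organized seeding?

def categorize(item: ContentItem) -> InfoType:
    """Apply the three categories in order of specificity (hypothetical rule)."""
    if item.verified_false and item.coordinated:
        return InfoType.DISINFORMATION
    if item.verified_false and item.mimics_journalism:
        return InfoType.FAKE_NEWS
    return InfoType.RUMOR
```

A coder (human or automated) would fill in the three boolean judgments per item; the function then yields one mutually exclusive label per piece of content.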
2.2.2. To Investigate the Relationship Between Social Media Algorithms and the Spread of Jihadist Narratives
2.2.3. To Understand the Psychological and Ideological Impact on Bangladeshi Youth
2.2.4. To Assess the Role of Platform Governance and State Response in Mitigating Disinformation and Propaganda
- Content moderation efforts,
- Transparency in algorithmic functioning, and
- Cooperation with local authorities.
2.2.5. To Explore Case Studies of Viral Misinformation and Their Real-World Effects
- Digital provocations during university protests,
- Online calls for jihad following alleged attacks on religious institutions,
- AI-generated ‘martyrdom’ videos from conflict zones manipulated for local mobilization.
2.2.6. To Propose a Framework for Digital Literacy and Policy Intervention in Bangladesh
- Digital literacy education,
- Algorithm transparency, and
- Rights-respecting counter-extremism frameworks.
2.3. Rationale for the Study Objectives
3. Significance of the Study
3.1. Academic Significance
3.2. Practical and Policy Significance
3.3. Socio-Psychological Significance
3.4. Significance for Counterterrorism and Security
- Mapping digital propaganda networks,
- Identifying patterns in disinformation-driven mobilization,
- Understanding how digital anonymity enables extremist recruitment.
3.5. Global South-Centric Knowledge Production
4. Theoretical Framework of the Study
4.1. Information Disorder Framework (Wardle & Derakhshan, 2017)
- Misinformation (false content shared without intent to harm),
- Disinformation (false content shared with deliberate intent to harm),
- Mal-information (genuine information shared maliciously).
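Because the framework turns on two axes — whether content is false and whether it is shared with intent to harm — it reduces to a small decision table. This sketch merely restates the definitions above in code form:

```python
def classify_information_disorder(is_false: bool, intent_to_harm: bool) -> str:
    """Wardle & Derakhshan's (2017) two-axis typology:
    falseness of content x intent behind sharing."""
    if is_false and not intent_to_harm:
        return "misinformation"      # false, shared without intent to harm
    if is_false and intent_to_harm:
        return "disinformation"      # false, shared deliberately to harm
    if not is_false and intent_to_harm:
        return "mal-information"     # genuine, shared maliciously
    return "genuine information"     # true and benignly shared
```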
4.2. Algorithmic Radicalization Theory (Tufekci, 2018; O’Callaghan et al., 2015)
4.3. Spiral of Silence Theory (Noelle-Neumann, 1974)
4.4. Affective Publics and Emotional Mobilization (Papacharissi, 2015; Ahmed, 2004)
4.5. Media Ecology Theory (Postman, 1970; McLuhan, 1964)
4.6. Social Identity and Radicalization Theory (Tajfel & Turner, 1979; Wiktorowicz, 2005)
4.7. Propaganda Model and Digital Adaptations (Herman & Chomsky, 1988; Marwick & Lewis, 2017)
5. Literature Review
- The Dynamics of Disinformation, Fake News, and Propaganda
- Social Media Algorithms and Digital Radicalization
- Youth, Identity Crisis, and Online Extremism
- Religious Fundamentalism and Information Warfare
- Bangladesh-Specific Studies on Extremism and Digital Violence
5.1. The Dynamics of Disinformation, Fake News, and Propaganda
5.2. Social Media Algorithms and Digital Radicalization
5.3. Youth, Identity Crisis, and Online Extremism
5.4. Religious Fundamentalism and Information Warfare
5.5. Bangladesh-Specific Studies on Extremism and Digital Violence
6. Contextual Overview: Bangladesh in July–August 2024
6.1. Political and Security Backdrop
6.2. Youth Radicalization and Digital Echo Chambers
6.3. Role of Disinformation and Fake News
6.4. Algorithmic Amplification and Platform Responsibility
6.5. Media and Governmental Response
6.6. Religious Institutions and Civil Society
6.7. The Socioeconomic Dimension
7. Young Jihadist Hostility and Algorithmic Radicalization
7.1. Defining the ‘Young Jihadist’ in the Bangladeshi Context
7.2. The Role of Social Media Platforms in Content Curation
7.3. Algorithmic Radicalization: A Stepwise Pathway
7.4. Content Typologies Used in Radicalization
- Emotive Religious Lectures: Delivered by charismatic speakers using Bengali and Arabic, focusing on themes of Muslim pride, global victimhood, and religious duty.
- Memes and Short Videos: Easily digestible content using humor, irony, or emotionally potent images to mock secularism or glorify martyrdom.
- False Testimonies: ‘Former atheist turned Mujahid’ or ‘Brother who saw a dream from Allah’—these narratives were shared to validate the spiritual authenticity of jihad.
- Global Injustice Footage: Images of suffering Muslims worldwide, often cropped or altered to increase emotional intensity and resentment.
- Anashid (Jihadist Songs): Audio content with poetic glorification of jihad, shared widely across TikTok, YouTube Shorts, and Telegram channels.
7.5. Case Studies of Algorithmic Radicalization
7.6. Role of Bots and Coordinated Influence Operations
7.7. Platform Inaction and the Limits of Content Moderation
- Lack of Local Language Moderation: Much of the content was in Bengali or Chittagonian dialects, which were poorly supported by automated moderation tools.
- Coded Language and Symbolism: Extremist content often used symbolic language that avoided detection, such as replacing jihad with ‘journey’ or using emojis to signal intent.
- Delayed Response Times: Harmful content often stayed up for days before being removed, long enough to be downloaded, reshared, and mirrored on other platforms.
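The second gap — coded language — can be illustrated with a toy filter. The euphemism map below is hypothetical (the text above only notes substitutions such as 'journey' for jihad), but it shows why literal blocklists fail without a locally curated normalization layer:

```python
# Terms a platform's literal blocklist might match.
BLOCKLIST = {"jihad"}

# Hypothetical coded substitutions of the kind described above.
EUPHEMISMS = {
    "journey": "jihad",
}

def naive_filter(text: str) -> bool:
    """Flag only literal blocklist hits — misses coded language."""
    return any(term in text.lower() for term in BLOCKLIST)

def context_aware_filter(text: str) -> bool:
    """Normalize known coded terms first, then match the blocklist."""
    normalized = text.lower()
    for coded, meaning in EUPHEMISMS.items():
        normalized = normalized.replace(coded, meaning)
    return naive_filter(normalized)
```

The same logic applies to emoji signals and dialect spellings: without a curated mapping maintained by moderators who know the local context, the naive filter returns false negatives.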
7.8. Psychological and Behavioral Outcomes
7.9. Counter-Algorithmic Strategies and Interventions
- Positive Content Seeding: NGOs created short-form videos and testimonials promoting peace, critical thinking, and pluralism, designed to mimic the style of viral jihadist content.
- Disruption Campaigns: Cyber units of the RAB (Rapid Action Battalion) and DGFI attempted to infiltrate extremist Telegram groups to monitor and disrupt activity.
- Algorithmic Auditing Tools: Independent researchers launched browser plug-ins to track how content changed based on user behavior—raising awareness among youth.
- Platform Accountability Laws: Draft legislation was introduced to mandate algorithmic transparency and localized content moderation by social media platforms.
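The auditing tools mentioned above can be sketched minimally as a session logger that measures how a feed's content mix shifts over time; the category labels and numbers here are invented for illustration, and real plug-ins would need an external classifier to label recommendations:

```python
from collections import Counter

class RecommendationAudit:
    """Minimal sketch of an algorithmic audit: log what a feed recommends
    per session, then measure how the mix drifts between sessions."""

    def __init__(self) -> None:
        self.sessions: list[list[str]] = []

    def log_session(self, recommended_categories: list[str]) -> None:
        self.sessions.append(recommended_categories)

    def share_of(self, category: str, session_index: int) -> float:
        """Fraction of a session's recommendations in the given category."""
        session = self.sessions[session_index]
        if not session:
            return 0.0
        return Counter(session)[category] / len(session)

# Hypothetical usage: two browsing sessions by the same (simulated) user.
audit = RecommendationAudit()
audit.log_session(["news", "sports", "religious"])
audit.log_session(["religious", "extremist", "extremist"])
drift = audit.share_of("extremist", 1) - audit.share_of("extremist", 0)
```

A positive `drift` value between sessions is the kind of signal such plug-ins surfaced to show users how their behavior reshaped their feed.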
8. Rumor, Fake News, and Disinformation Networks
8.1. Conceptual Framework: Rumor, Fake News, and Disinformation
- Rumor: Unverified information that spreads rapidly, often fueled by uncertainty or fear.
- Fake News: Fabricated content presented as legitimate news, typically designed to mislead for political or financial gain.
- Disinformation: Deliberately false information disseminated to deceive and manipulate public perception.
8.2. The Surge of Information Disorder During July–August 2024
8.3. Dominant Disinformation Narratives
- ‘Islam Under Siege’: Narratives suggesting that Muslims were being persecuted by secular or foreign-influenced entities.
- ‘False Flag Operations’: Claims that attacks were orchestrated by state actors to justify crackdowns on dissent.
- ‘Heroic Martyrdom’: Glorification of individuals involved in violent acts as martyrs defending Islam.
- ‘External Enemies’: Allegations that foreign agencies, such as India’s RAW or Western NGOs, were destabilizing Bangladesh.
- ‘Minority Infiltration’: Assertions that non-Muslims were covertly taking over key positions in government and society.
8.4. Dissemination Channels and Techniques
- Facebook: Utilized for mass dissemination via groups and pages, often employing emotionally charged content.
- WhatsApp and Telegram: Facilitated rapid, encrypted sharing of rumors within trusted networks.
- TikTok and YouTube Shorts: Employed short, visually engaging videos to appeal to younger demographics.
- YouTube and Facebook Live: Hosted longer-form content, including sermons and discussions, to provide ideological justification for disinformation.
8.5. Key Actors in Disinformation Campaigns
- Religious Influencers: Some clerics disseminated false narratives under the guise of religious teachings.
- Digital Activists: Groups coordinated online campaigns to produce and share misleading content.
- Bot Networks: Automated accounts amplified disinformation by increasing its visibility.
- Diaspora Communities: Certain overseas individuals contributed to the spread of false narratives.
- Political Entities: Some political groups utilized disinformation to undermine opponents and sway public opinion.
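The bot-network amplification listed above is typically detected with behavioral heuristics. The thresholds in this sketch are illustrative assumptions, not findings of the study:

```python
from dataclasses import dataclass

@dataclass
class Account:
    posts_per_hour: float
    duplicate_text_ratio: float  # share of posts identical to other accounts'
    account_age_days: int

def bot_likelihood_flags(acct: Account) -> list[str]:
    """Toy heuristic for the amplification pattern described above.
    All thresholds are hypothetical."""
    flags = []
    if acct.posts_per_hour > 10:
        flags.append("abnormal posting rate")
    if acct.duplicate_text_ratio > 0.8:
        flags.append("copy-paste amplification")
    if acct.account_age_days < 30:
        flags.append("recently created")
    return flags
```

An account tripping several flags at once is a candidate for review, not proof of automation — which is why platform and researcher typologies (e.g., Gorwa & Guilbeault, 2020) combine such signals with network-level analysis.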
8.6. Case Studies of Disinformation Impact
8.7. Societal and Psychological Effects
- Erosion of Trust: Public confidence in media and governmental institutions declined.
- Heightened Sectarian Tensions: Minority communities faced increased suspicion and hostility.
- Normalization of Extremism: Radical ideologies gained mainstream acceptance among certain groups.
- Precipitation of Violence: Misinformation directly contributed to outbreaks of violence and unrest.
8.8. Institutional Responses and Challenges
- Government Initiatives: Authorities attempted to monitor and suppress false information, though these actions were sometimes perceived as censorship.
- Fact-Checking Organizations: Entities like Rumor Scanner and Dismislab intensified verification efforts, albeit with limited reach.
- Platform Moderation: Social media companies faced criticism for inadequate content moderation, particularly in non-English contexts (Time, 2024).
8.9. Recommendations for Mitigation
- Enhancing Digital Literacy: Educational programs to improve critical thinking and media literacy among the populace.
- Strengthening Fact-Checking Networks: Support for independent verification organizations to expand their capabilities.
- Regulatory Frameworks: Development of policies to hold platforms accountable for content moderation.
- Community Engagement: Involving local leaders and influencers in promoting accurate information.
9. Propaganda and the Algorithmic Manipulation of Truth
9.1. Defining Propaganda in the Digital Era
9.2. The Political Economy of Algorithmic Propaganda
9.3. Structures of Algorithmic Manipulation
9.3.1. Engagement-Based Prioritization
9.3.2. Coordinated Inauthentic Behavior
9.3.3. Algorithmic Down-ranking of Dissent
9.4. Visual Propaganda Optimization
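The engagement-based prioritization described in Section 9.3.1 can be made concrete with a stylized ranking score. The weights are hypothetical, but they show how weighting amplifying interactions (shares, outrage reactions) above passive approval lets inflammatory content outrank calmer posts:

```python
def engagement_score(likes: int, comments: int, shares: int,
                     angry_reactions: int) -> float:
    """Stylized feed-ranking score with illustrative weights: amplifying
    interactions count far more than passive approval."""
    return 1.0 * likes + 4.0 * comments + 8.0 * shares + 6.0 * angry_reactions

# A widely liked but calm post vs. a divisive post that provokes
# comments, shares, and angry reactions (numbers are invented).
neutral = engagement_score(likes=500, comments=20, shares=10, angry_reactions=2)
inflammatory = engagement_score(likes=200, comments=150, shares=120, angry_reactions=300)
```

Under these weights the divisive post scores several times higher despite fewer likes — the basic mechanism by which engagement-optimized feeds reward provocation.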
9.5. Narrative Engineering and Ideological Packaging
9.5.1. The Martyrdom Mythos
9.5.2. The ‘Oppressor State’ Imaginary
9.5.3. The Global Muslim Victimhood Nexus
9.6. Local and Transnational Propaganda Ecosystems
9.7. Case Studies of Algorithmic Propaganda
9.8. Psychological Effects of Algorithmic Propaganda
- Epistemic Confusion: Many citizens reported confusion over what was true or false, resulting in decision paralysis or radicalization (Lewandowsky, Ecker, & Cook, 2017).
- Emotional Saturation: Constant exposure to emotionally manipulative content led to desensitization or overreaction.
- Tribalism: Users were algorithmically steered into ideological silos, reinforcing extreme views and distrust of the ‘other.’
9.9. Institutional Responses and Ethical Dilemmas
9.9.1. Government Actions
9.9.2. Platform Responses
9.9.3. Civil Society and Fact-Checking
9.10. Recommendations
- Algorithmic Accountability: Platforms must disclose how algorithms prioritize content and must adjust parameters to downrank harmful or false narratives (Gillespie, 2018).
- Localized Moderation: Employing more content moderators fluent in Bangla and culturally aware of regional nuances is crucial.
- Counter-Narrative Campaigns: Civil society must actively produce emotionally resonant, fact-based counter-narratives that can compete within algorithmic ecosystems.
- Media Literacy Education: Especially among youth, digital literacy programs should focus on recognizing propaganda and algorithmic manipulation.
10. Case Studies: Viral Disinformation and Jihadist Mobilization
10.2. Case Study 1: ‘Hijab Girl Martyred’ – TikTok as a Tool of Emotional Mobilization
10.2.1. The Incident
10.2.2. The Disinformation Dynamic
10.2.3. Mobilization Outcomes
10.3. Case Study 2: ‘Quran Burnt in Police Raid’ – The Weaponization of Religious Sentiment
10.3.1. The Incident
10.3.2. Narrative Engineering and Viral Spread
10.3.3. Algorithmic Impact and Mobilization
10.4. Case Study 3: ‘Digital Shaheed List’ – Telegram and the Creation of Martyrdom
10.4.1. Psychological Warfare and Social Proof
10.4.2. Mobilization Effect
10.5. Case Study 4: ‘The Foreign Enemy’ – Anti-Rohingya Disinformation and Strategic Diversion
10.5.1. Context and Content
10.5.2. Strategic Use of Disinformation
10.5.3. Counterproductive Consequences
10.6. Analysis: Common Patterns Across Case Studies
10.6.1. Emotional Triggers and Religious Symbolism
10.6.2. Cross-Platform Synergy
10.6.3. Fact-Checker Deficit
10.6.4. Youth-Centric Targeting
10.7. Countermeasures and Ethical Challenges
10.7.1. Government Response
10.7.2. Platform Responsibilities
10.7.3. Ethical Counter-Disinformation
11. The Role of Social Media Platforms and Data Capitalism
11.1. Platform Design and Algorithmic Amplification
11.1.1. The Business Model of Engagement
11.2. Platform Inertia and Delayed Moderation
11.3. Algorithmic Radicalization
11.3.1. Recommendation Engines and Ideological Funnels
11.3.2. The Role of Auto-Play and Infinite Scroll
11.4. Monetizing Extremism: Data Capitalism and Crisis
11.4.1. Exploiting User Data for Targeted Content
11.4.2. Advertising Infrastructure and Political Monetization
11.5. Platform Governance Failures
11.5.1. Inadequate Localization and Language Support
11.5.2. Political Economy of Censorship
11.6. The Rise of Encrypted and Alternative Platforms
11.6.1. Telegram and the Shadow Internet
11.6.2. Bypassing Moderation via Alternate Apps
11.7. Platform Responses and Their Limitations
11.7.1. Content Removal and AI Moderation
11.7.2. Tokenistic Community Guidelines and Trust & Safety Programs
11.8. The Political Economy of Surveillance and Resistance
11.8.1. Surveillance Capitalism and Data Extraction
11.8.2. Digital Colonialism and Platform Power
12. State, Surveillance, and Counter-Narratives
12.1. The State’s Surveillance Infrastructure
12.1.1. Expansion of Digital Surveillance Capabilities
12.1.2. Predictive Policing and Algorithmic Profiling
12.3. Social Media Governance and State Control
12.3.1. Collaboration and Coercion of Platforms
12.3.2. Internet Shutdowns and Communication Blackouts
12.4. Counter-Narratives: Strategy and Execution
12.4.1. State-Run Counter-Narrative Programs
12.4.2. Civil Society and Non-State Actors
12.5. Challenges and Criticisms
12.5.1. The Problem of Overreach and False Positives
12.5.2. Lack of Transparency and Oversight
12.6. Toward a Democratic Digital Security Framework
12.6.1. Rights-Based Digital Governance
12.6.2. Participatory Counter-Narrative Ecosystems
13. Policy Analysis and Digital Governance in Bangladesh
13.2. Overview of Digital Governance Framework in Bangladesh
13.2.1. Legal Instruments Governing the Digital Sphere
13.2.2. Institutional Architecture and Enforcement Agencies
13.3. The Role of Social Media Platforms and Corporate Governance
13.3.1. Platform Compliance with State Regulations
13.3.2. Algorithmic Amplification and Responsibility
13.4. Challenges in Digital Governance: Balancing Security, Rights, and Innovation
13.4.1. Security vs. Freedom of Expression
13.4.2. Capacity Constraints and Digital Literacy Gaps
13.4.3. The Digital Divide and Inclusion
13.5. Emerging Trends and Policy Innovations
13.5.1. Multi-Stakeholder Governance Approaches
13.5.2. Data Privacy and Protection Initiatives
13.5.3. Digital Literacy and Counter-Misinformation Campaigns
13.6. Policy Recommendations
- Reform the Digital Security Act to clarify definitions, incorporate human rights safeguards, and establish independent oversight mechanisms to prevent misuse.
- Enhance transparency and accountability in content moderation by requiring social media platforms to publish localized transparency reports and establish accessible appeal processes.
- Invest in capacity-building for law enforcement, judiciary, and regulators, with emphasis on technical expertise and digital evidence management.
- Promote multi-stakeholder dialogue involving government, civil society, academia, and private sector actors to co-create governance frameworks that reflect diverse interests and expertise.
- Advance comprehensive data protection legislation that safeguards user privacy and regulates state surveillance within constitutional bounds.
- Expand digital literacy programs, targeting marginalized communities and integrating critical media skills into formal education.
- Foster inclusive governance by ensuring marginalized voices are heard in policy-making processes, thereby enhancing social cohesion and resilience.
14. Discussion
14.1. The Interplay Between Digital Media and Jihadist Hostility
14.2. Algorithmic Manipulation and Platform Governance
14.3. The Role of State Surveillance and Counter-Narratives
14.4. Implications for Digital Governance and Policy
14.5. Societal and Psychological Dimensions
14.6. Limitations and Areas for Future Research
15. Conclusions and Recommendations
15.1. Conclusions
15.2. Recommendations
References
- Access Now. (2024). Internet shutdowns in Bangladesh during July–August 2024. Available online: https://www.accessnow.org.
- Ahmed, F., & Rahman, M. (2022). Surveillance technologies in South Asia: A comparative study of state capacities. Asian Journal of Digital Governance, 5, 55–78.
- Ahmed, S. (2023). Digital governance and law enforcement in Bangladesh: Challenges and prospects. Bangladesh Journal of Cybersecurity, 4, 45–63.
- Ahmed, S. (2023). Memory and the Making of Political Islam in Bangladesh. Dhaka: UPL.
- Akhter, S. (2021). Refugee scapegoating and the politics of fear in South Asia. Contemporary South Asia, 29, 312–326.
- Amnesty International. (2023). Bangladesh: Digital security law used to silence dissent. Available online: https://www.amnesty.org/en/latest/news/2023/02/bangladesh-digital-security-law.
- Arnaudo, D. (2017). Computational propaganda in Brazil: Social bots during elections. Oxford Internet Institute Working Paper Series.
- Awan, I. (2017). Cyber-extremism: Isis and the power of social media. Society, 54, 138–149.
- BBS. (2024). Statistical Yearbook of Bangladesh 2023. Bangladesh Bureau of Statistics.
- Bradshaw, S., & Howard, P. N. (2019). The global disinformation order: 2019 global inventory of organized social media manipulation. Oxford Internet Institute.
- Bradshaw, S., Neudert, L.-M., & Howard, P. N. (2018). Government responses to organized disinformation campaigns. Council of Europe Report.
- Chakraborty, S., & Rahman, T. (2023). Digital authoritarianism in South Asia: Emerging patterns and resistance. Global Digital Rights Journal, 7, 22–39.
- Chowdhury, M. (2024). Social media moderation and state influence in Bangladesh. Journal of Media Studies, 11, 112–130.
- Cinelli, M., Morales, G. D. F., Galeazzi, A., Quattrociocchi, W., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118, e2023301118.
- Cinelli, M., Quattrociocchi, W., Galeazzi, A., Valensise, C. M., Brugnoli, E., Schmidt, A. L., Zola, P., Zollo, F., & Scala, A. (2020). The COVID-19 social media infodemic. Scientific Reports, 10, 16598.
- Couldry, N., & Mejias, U. A. (2019). The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism. Stanford University Press.
- Daily Star. (2024, August 5). Teenage meme-maker arrested under DSA. The Daily Star. Available online: https://www.thedailystar.net.
- Dismislab. (2024). How disinformation stoked fear, panic and confusion during Bangladesh's student-led revolution. Available online: https://en.dismislab.com/how-disinformation-stoked-fear-panic-and-confusion-during-bangladeshs-student-led-revolution/.
- Dismislab. (2024). Predictive Policing and Digital Profiling in Bangladesh: A Human Rights Assessment. Available online: https://www.dismislab.org.
- Douek, E. (2020). Content moderation as systems thinking. Harvard Journal of Law & Technology, 33, 1–56.
- Feldstein, S. (2019). The global expansion of AI surveillance. Carnegie Endowment for International Peace. Available online: https://carnegieendowment.org.
- Gillespie, T. (2018). Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press.
- Gorwa, R., & Guilbeault, D. (2020). Unpacking the social media bot: A typology to guide research and policy. Policy & Internet, 12, 225–248.
- Guess, A. M., Nagler, J., & Tucker, J. A. (2019). Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science Advances, 5, eaau4586.
- Human Rights Watch. (2023). Bangladesh: Free speech under threat. Available online: https://www.hrw.org/report/2023/01/bangladesh-free-speech.
- Human Rights Watch. (2024). Internet Shutdowns and Information Suppression in Bangladesh's 2024 Crisis. Available online: https://www.hrw.org.
- Islam, M., & Chowdhury, R. (2023). Digital literacy challenges in rural Bangladesh. International Journal of Media and Communication Studies, 8, 23–40.
- Islam, R. (2021). Digital Security Act in Bangladesh: Tool for security or suppression? South Asian Law Review, 6, 103–119.
- Jowett, G. S., & O'Donnell, V. (2018). Propaganda & Persuasion (7th ed.). SAGE Publications.
- Khan, S., & Siddiqui, A. (2024). Multi-stakeholder digital governance in South Asia: Lessons for Bangladesh. Asian Policy Review, 6, 70–89.
- Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the 'post-truth' era. Journal of Applied Research in Memory and Cognition, 6, 353–369.
- MacKinnon, R. (2012). Consent of the networked: The worldwide struggle for internet freedom. Basic Books.
- Marwick, A., & Lewis, R. (2017). Media manipulation and disinformation online. Data & Society Research Institute. Available online: https://datasociety.net/library/media-manipulation-and-disinfo-online/.
- Meta Transparency Report. (2024). Content removal requests in Bangladesh. Available online: https://transparency.fb.com/reports/2024/bangladesh.
- NetBlocks. (2024). Tracking digital disruptions in Bangladesh. Available online: https://netblocks.org.
- Neumann, P. R. (2013). The trouble with radicalization. International Affairs, 89, 873–893.
- Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39–50.
- Rahim, N., & Hossain, J. (2024). Data protection policy landscape in Bangladesh: Emerging frameworks and challenges. Journal of Information Policy, 14, 101–121.
- Rahman, T., & Alam, F. (2020). The Digital Security Act 2018 and implications for freedom of expression in Bangladesh. South Asian Journal of Law and Policy, 2, 15–38.
- Reuters. (2024). Photo does not show protesters in Bangladeshi presidential residence. Available online: https://www.reuters.com/fact-check/photo-does-not-show-protesters-bangladeshi-presidential-residence-2024-08-09/.
- Ribeiro, M. H., Ottoni, R., West, R., Almeida, V. A., & Meira, W. (2020). Auditing radicalization pathways on YouTube. Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, 131–141.
- Rumor Scanner. (2024). Disinformation Tracking Reports: July–August 2024. Available online: https://rumorscanner.com.
- Rumor Scanner. (2024). Fact-checking archive: July–August disinformation. Available online: https://rumorscanner.com.
- Rumor Scanner. (2025). A Record-Breaking 2,919 Cases of Misinformation Identified. Available online: https://rumorscanner.com/en/statistics-2/rumors-data-2024/134624.
- Srnicek, N. (2017). Platform capitalism. Polity Press.
- Stern, J., & Berger, J. M. (2015). ISIS: The state of terror. HarperCollins.
- Sunstein, C. R. (2018). #Republic: Divided democracy in the age of social media. Princeton University Press.
- Time. (2024). Exclusive: Tech Companies Are Failing to Keep Elections Safe, Rights Groups Say. Available online: https://time.com/6967334/ai-elections-disinformation-meta-tiktok/.
- Tufekci, Z. (2015). Algorithmic harms beyond Facebook and Google: Emergent challenges of computational agency. Colorado Technology Law Journal, 13, 203–218.
- Tufekci, Z. (2017). Twitter and tear gas: The power and fragility of networked protest. Yale University Press.
- Tufekci, Z. (2018). YouTube, the great radicalizer. The New York Times. Available online: https://www.nytimes.com.
- Ullah, A., & Ahmed, S. (2022). Content moderation in South Asia: The blind spots of AI and platform policy. Digital Rights South Asia Journal, 4, 45–67.
- UNDP. (2024). Youth, Peace and Countering Violent Extremism in Bangladesh: Mid-Year Review. United Nations Development Programme. Available online: https://www.undp.org.
- UNICEF Bangladesh. (2024). Digital literacy and youth empowerment programs in Bangladesh. Available online: https://www.unicef.org/bangladesh/reports/digital-literacy.
- Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making. Council of Europe. Available online: https://rm.coe.int/information-disorder-report-february-2017/1680764666.
- Weimann, G. (2015). Terrorism in cyberspace: The next generation. Columbia University Press.
- Wikipedia. (2024). Chalaiden. Available online: https://en.wikipedia.org/wiki/Chalaiden.
- Winter, C. (2015). Documenting the Virtual 'Caliphate'. Quilliam Foundation.
- World Bank. (2023). Bangladesh digital economy assessment: Bridging the divide. Available online: https://documents.worldbank.org/en/publication/documents-reports.
- Zaman, A. (2024). Counterterrorism and digital rights in Bangladesh: Striking a balance. Journal of Peace and Security Studies, 5, 33–56.
- Zannettou, S., Sirivianos, M., Blackburn, J., & Kourtellis, N. (2019). The web of false information: Rumors, fake news, hoaxes, clickbait, and various other shenanigans. Journal of Data and Information Quality, 11, 1–37.
Disclaimer/Publisher's Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).