Preprint (Review)

This version is not peer-reviewed.

The Two-Step Flow Theory in the Digital Age (2005–2025): An Analytical Literature Review

Submitted: 03 May 2025
Posted: 06 May 2025


Abstract
The Two-Step Flow (TSF) theory, developed in the mid-20th century, posits that mass media influence is mediated by opinion leaders who interpret and relay messages to wider audiences. This literature review synthesizes and critically analyzes approximately 60 studies published between 2005 and 2025, exploring the relevance, evolution, and limitations of TSF within the digital media ecosystem. The review evaluates the reconceptualization of opinion leadership (influencers, micro-celebrities, and networked individuals), the transformation of influence pathways (multi-step, networked, and algorithmic flows), and TSF’s application across political communication, health, marketing, and misinformation. While digital media's interactivity, user-generated content, and algorithmic curation challenge the original TSF model, key concepts such as mediated influence and the importance of interpersonal networks persist. The findings suggest that TSF’s enduring value lies in its foundational insight into social mediation, though future research must incorporate algorithmic influence, cross-platform dynamics, and the heterogeneity of digital opinion leadership. The review concludes with a critical discussion and actionable recommendations for future research.

1. Introduction

1.1. Background and Rationale

The past two decades have witnessed a seismic transformation in the communication landscape, driven primarily by the proliferation of the internet, social media, and digital platforms (Baym, 2010; Rainie & Wellman, 2012; van Dijck, 2013). Foundational communication theories, developed in an era dominated by traditional mass media, now face renewed scrutiny as researchers assess their applicability in a complex, multi-platform world. Among these theories, the Two-Step Flow (TSF) theory, first proposed by Lazarsfeld, Berelson, and Gaudet (1944) and later elaborated by Katz and Lazarsfeld (1955), remains one of the most influential frameworks in media studies. The TSF model originally challenged the direct-effects paradigm—epitomized by the "hypodermic needle" or "magic bullet" theories—by arguing that media effects are largely mediated by "opinion leaders." These individuals, more attentive to media and trusted within their social circles, interpret and relay media messages to less engaged peers, shaping public opinion through interpersonal networks (Katz, 1957; Katz & Lazarsfeld, 1955). However, the contemporary digital media environment is characterized by ubiquitous connectivity, interactivity, user-generated content, networked publics, algorithmic curation, and the blurring of content creators and consumers (Bruns, 2008; Bucher, 2017; Marwick & boyd, 2011). These features raise fundamental questions about the relevance and limitations of the original TSF model:
  • Do opinion leaders still play a central role in mediating media effects?
  • How have digital affordances, such as algorithmic gatekeeping and platform architectures, restructured information flows?
  • What adaptations or alternative models are necessary to understand influence in the digital age?

1.2. Objectives and Structure

This literature review aims to:
  • Synthesize empirical research from 2005–2025 assessing the TSF theory in digital environments.
  • Analyze the changing nature of opinion leadership and the structure of influence in digital media.
  • Examine the application and testing of TSF in specific domains: politics, health, marketing, and misinformation.
  • Critically evaluate the limitations of TSF and review complementary or alternative models.
  • Provide a nuanced discussion and recommendations for future research.
The review is organized as follows: Section 2 revisits the original TSF theory and early critiques. Section 3 explores the evolution of TSF in the digital media ecosystem. Section 4 reviews domain-specific applications. Section 5 critically discusses TSF’s limitations and alternative models. Section 6 synthesizes findings and future research directions. Section 7 presents a critical discussion, Section 8 offers recommendations, and Section 9 concludes. A complete, alphabetically sorted reference list follows.

2. The Original Two-Step Flow Theory and Early Critiques

2.1. Foundations of TSF

The TSF theory emerged from Lazarsfeld, Berelson, and Gaudet’s (1944) Erie County study on voting behavior, which found that interpersonal communication from opinion leaders had a greater effect on voting decisions than direct media exposure. This overturned the prevailing belief in powerful, uniform media effects (Bauer, 1964). Katz and Lazarsfeld (1955) expanded upon these findings in their Decatur study, emphasizing that opinion leaders—typically more media-engaged and socially active—act as intermediaries, filtering, interpreting, and legitimizing mass media messages for their peers. Katz (1957) further articulated the model’s core tenets, highlighting the distinction between active opinion leaders and passive followers, the primacy of interpersonal communication, and selective exposure to media content.

2.2. Early Critiques of TSF

Despite its enduring influence, TSF faced several critiques even in the pre-digital era:
  • Oversimplification: Critics argued that communication flows are more complex than a simple two-step process, often involving multi-step, one-step, or networked flows (Robinson, 1976; Troldahl, 1966; Van den Ban, 1964).
  • Fluidity of Opinion Leadership: The leader-follower dichotomy was seen as artificial and context-dependent; individuals could be leaders in one domain and followers in another (Lin, 1971).
  • Underestimation of Direct Media Effects: The model was critiqued for downplaying the direct effects of media, especially in agenda-setting and awareness (McCombs & Shaw, 1972).
  • Active Audiences: The portrayal of opinion followers as passive recipients was challenged by research emphasizing audience agency and independent interpretation (Bauer, 1964).
Nevertheless, TSF shifted the focus of media effects research toward recognizing social context and interpersonal influence, laying the groundwork for later network perspectives (Weimann, 1982).

3. The Two-Step Flow Theory in the Digital Media Ecosystem

3.1. The Evolution of Opinion Leadership

Table 1. Evolution of Opinion Leadership in the Digital Age.

3.1.1. Traditional Opinion Leaders in Digital Spaces

Despite technological and social changes, traditional opinion leaders—such as journalists, politicians, and recognized experts—continue to exert influence online. These individuals often leverage platforms like Twitter and Facebook to disseminate their perspectives, maintain authority, and reach wide audiences (Bruns & Burgess, 2011; Parmelee, 2014). Political journalists, for example, frequently act as opinion leaders for both the public and other media professionals during elections (Parmelee, 2014).

3.1.2. The Rise of Digital Influencers

The digital era has witnessed the emergence of new types of opinion leaders: digital influencers, micro-celebrities, and networked individuals who gain prominence primarily through social media platforms (Abidin, 2016; Khamis et al., 2017). These influencers, often self-branded and operating across diverse niches (e.g., beauty, gaming, activism), shape consumer behavior, health decisions, and even political attitudes (Casaló et al., 2020; Lou & Yuan, 2019; Sokolova & Kefi, 2020). Unlike traditional leaders, digital influencers build trust and influence through perceived authenticity, relatability, and parasocial relationships (Jin et al., 2019; Lee & Watkins, 2016). Micro- and nano-influencers—with smaller but highly engaged followings—often wield disproportionate influence within specific communities due to their credibility and network position (Campbell & Farrell, 2020; Gökalp et al., 2022; Weeks et al., 2017).

3.1.3. Networked and Algorithmic Leadership

Advancements in network analysis have allowed researchers to empirically identify influential nodes in digital conversations, moving beyond self-reported leadership (Gonzalez-Bailon et al., 2011; Himelboim et al., 2012). Influence online is increasingly understood as dynamic, context-specific, and structurally defined by network centrality (Choi, 2015; Watts & Dodds, 2007).
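To make the network-analytic approach concrete, the following minimal sketch (illustrative only, not reproduced from any of the cited studies) uses the Python networkx library to rank accounts in a small hypothetical retweet network by common structural indicators of influence; the account names, edge list, and metric choices are invented for demonstration.

```python
# Minimal sketch (illustrative, not drawn from any cited study): ranking accounts
# in a hypothetical retweet network by structural centrality, the kind of metric
# used to identify candidate opinion leaders empirically.
import networkx as nx

# Directed edge (u, v) means "u retweeted v", so attention flows toward v.
retweets = [
    ("ana", "journalist1"), ("ben", "journalist1"), ("cai", "journalist1"),
    ("ana", "influencer7"), ("dee", "influencer7"), ("ben", "dee"),
    ("eli", "ana"), ("cai", "dee"),
]
G = nx.DiGraph(retweets)

in_degree = dict(G.in_degree())             # how many distinct accounts amplify this node
betweenness = nx.betweenness_centrality(G)  # brokerage between otherwise separate clusters
pagerank = nx.pagerank(G)                   # recursive prestige: amplified by amplified accounts

for node in sorted(G.nodes, key=pagerank.get, reverse=True):
    print(f"{node:12s} in-degree={in_degree[node]:2d} "
          f"betweenness={betweenness[node]:.3f} pagerank={pagerank[node]:.3f}")
```

In empirical work, such metrics are computed on large platform datasets and combined with content and temporal analysis rather than read off a toy graph; the sketch only illustrates what "structurally defined" influence means in practice.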
A key contemporary development is the role of algorithms in amplifying certain voices and content. Algorithms, acting as non-human agents, shape visibility and confer influence based on engagement metrics rather than expertise, fundamentally altering the logic of opinion leadership (Bucher, 2017; Noble, 2018; Cotter et al., 2022).

3.2. Transformation of Influence Pathways

Table 2. Transformation of Influence Pathways in the Digital Age.

3.2.1. Multi-Step and Networked Flows

Digital environments facilitate multi-step information flows, where content is shared and reshaped through multiple intermediaries before reaching broader audiences (Bakshy et al., 2012; Goel et al., 2012; Hansen et al., 2011). Retweeting and sharing on platforms like Twitter and Facebook exemplify this cascading process, with information potentially being modified at each step.
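The cascading logic described above can be illustrated with a simple simulation. The sketch below uses a generic independent-cascade model (a standard diffusion model, employed here only as an illustration and not as the method of the studies cited) to show how a message seeded by one account can pass through several resharing intermediaries before reaching the periphery of a hypothetical follower network; all parameters are invented.

```python
# Illustrative independent-cascade simulation of a multi-step flow.
# Each newly exposed follower reshares the message independently with probability p.
import random
import networkx as nx

def independent_cascade(G, seeds, p, rng):
    """Return the set of activated accounts and the number of resharing steps."""
    active = set(seeds)
    frontier = list(seeds)
    steps = 0
    while frontier:
        nxt = []
        for account in frontier:
            for follower in G.successors(account):   # edge u -> v: v follows u
                if follower not in active and rng.random() < p:
                    active.add(follower)
                    nxt.append(follower)
        if nxt:
            steps += 1
        frontier = nxt
    return active, steps

# Hypothetical follower network; size, density, and reshare probability are invented.
rng = random.Random(42)
G = nx.gnp_random_graph(200, 0.03, seed=1, directed=True)
reached, hops = independent_cascade(G, seeds=[0], p=0.15, rng=rng)
print(f"Message reached {len(reached)} of {G.number_of_nodes()} accounts "
      f"after {hops} step(s) of resharing.")
```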

3.2.2. One-Step and Direct Flows

Paradoxically, digital media also enables more direct, one-step flows. Individuals can access a vast array of information sources directly, bypassing traditional intermediaries (Bennett & Manheim, 2006; Flaxman et al., 2016). Personalization algorithms and search engines deliver tailored content, potentially reducing the mediating role of opinion leaders—though algorithmic mediation itself becomes a new form of influence (Bucher, 2017).

3.2.3. Networked Flow and the Strength of Weak Ties

Many scholars argue that networked influence models better capture digital realities, emphasizing the interplay between mass media, social networks, and individual expression (Castells, 2009; Rainie & Wellman, 2012). Information spreads rapidly through both strong and weak ties; Granovetter’s (1973) "strength of weak ties" concept is especially relevant as digital networks facilitate diffusion across diverse groups (Centola & Macy, 2007; Valenzuela et al., 2018).

3.2.4. Echo Chambers and Filter Bubbles

Algorithmic curation and homophilous sorting contribute to the formation of echo chambers and filter bubbles, where individuals are exposed primarily to reinforcing viewpoints (Barberá et al., 2015; Pariser, 2011; Sunstein, 2017). This phenomenon complicates TSF by suggesting that influence may operate powerfully within fragmented, homogenous publics.
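A toy simulation can illustrate the mechanism. The sketch below (invented for illustration, not a model drawn from the cited literature) forms ties preferentially between like-minded accounts and then measures the average share of each account’s neighbors who hold the opposing view, showing in miniature how homophilous sorting narrows cross-cutting exposure.

```python
# Toy illustration of homophilous sorting: two opinion camps, ties drawn with
# higher probability between like-minded accounts, and exposure to the opposing
# view measured as the average share of cross-cutting neighbors.
import random

rng = random.Random(7)
n = 400
opinions = [rng.choice([-1, 1]) for _ in range(n)]   # two camps of roughly equal size

def build_network(p_same, p_cross):
    """Undirected ties: like-minded pairs connect with p_same, opposed pairs with p_cross."""
    ties = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            p = p_same if opinions[i] == opinions[j] else p_cross
            if rng.random() < p:
                ties[i].add(j)
                ties[j].add(i)
    return ties

def cross_cutting_share(ties):
    """Average fraction of an account's neighbors who hold the opposing opinion."""
    shares = [sum(opinions[j] != opinions[i] for j in nbrs) / len(nbrs)
              for i, nbrs in ties.items() if nbrs]
    return sum(shares) / len(shares)

no_sorting = build_network(p_same=0.02, p_cross=0.02)     # ties ignore opinion
homophilous = build_network(p_same=0.035, p_cross=0.005)  # ties favor the like-minded
print(f"cross-cutting exposure without homophily: {cross_cutting_share(no_sorting):.2f}")
print(f"cross-cutting exposure with homophily:    {cross_cutting_share(homophilous):.2f}")
```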

3.3. Platform Architecture and Affordances

Platform-specific features—such as visibility metrics, sharing mechanisms, algorithmic gatekeeping, and context collapse—actively structure communication flows and influence dynamics (Marwick & boyd, 2011; Gillespie, 2014; Napoli, 2014):
  • Visibility and Social Metrics: Likes, shares, and follower counts provide visible social cues, amplifying perceived influence (Haim et al., 2018; van Dijck, 2013).
  • Sharing Mechanisms: Features like retweets and shares accelerate diffusion and facilitate viral cascades (Guille et al., 2013; Kwak et al., 2010).
  • Algorithmic Gatekeeping: Algorithms prioritize content based on engagement and user history, often amplifying certain voices while marginalizing others (Cotter et al., 2022); a toy sketch of engagement-based ranking follows this list.
  • Context Collapse: Blurring of social contexts means messages intended for one group may reach unintended audiences, complicating targeted influence (Marwick & boyd, 2011).
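As a concrete, hedged illustration of the gatekeeping point above, the sketch below ranks a handful of hypothetical posts by a simple engagement score with a recency decay. The scoring weights, field names, and half-life are invented; no actual platform algorithm is being reproduced, only the general logic by which engagement-driven ranking tends to confer further visibility on already-popular voices.

```python
# Toy sketch (invented, not any platform's actual algorithm) of engagement-based
# feed ranking: posts are scored by engagement signals with a recency decay.
import time

def rank_feed(posts, now=None, half_life_hours=6.0):
    """Return posts sorted by a simple engagement score decayed by age."""
    now = now if now is not None else time.time()
    def score(post):
        engagement = post["likes"] + 3 * post["shares"] + 2 * post["comments"]  # arbitrary weights
        age_hours = (now - post["posted_at"]) / 3600
        decay = 0.5 ** (age_hours / half_life_hours)   # score halves every half_life_hours
        return engagement * decay
    return sorted(posts, key=score, reverse=True)

now = time.time()
posts = [  # hypothetical posts
    {"author": "major_influencer", "likes": 900, "shares": 120, "comments": 60, "posted_at": now - 8 * 3600},
    {"author": "local_expert",     "likes": 40,  "shares": 5,   "comments": 12, "posted_at": now - 1 * 3600},
    {"author": "ordinary_user",    "likes": 3,   "shares": 0,   "comments": 1,  "posted_at": now - 0.5 * 3600},
]
for post in rank_feed(posts, now=now):
    print(post["author"])
```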

4. Application and Testing of TSF in Digital Contexts

Table 3. TSF Application and Challenges in Key Digital Domains.

4.1. Political Communication

TSF’s original context was politics. In the digital era, network analysis consistently identifies influential users shaping online political discourse (An et al., 2014; Conway et al., 2015; Himelboim et al., 2012). Selective exposure and partisan echo chambers reinforce the role of in-group opinion leaders (Bakshy et al., 2015; Iyengar & Hahn, 2009), while political campaigns increasingly target online influencers to mobilize supporters (Bimber & Davis, 2003; Gibson & Cantijoch, 2013; Vaccari & Valeriani, 2015). However, empirical studies highlight the complexity beyond TSF. Meraz (2009) documents multi-step flows in the political blogosphere, while Weeks et al. (2017) find hybrid models integrating traditional, digital, and interpersonal sources.

4.2. Health Communication

Health communication research identifies diverse online opinion leaders, from medical professionals to patient advocates (Chen et al., 2018; Hesse et al., 2005; Ho et al., 2021). Online health communities foster peer influence, and health organizations increasingly leverage influencers to disseminate public health messages (Eysenbach et al., 2004; Kite et al., 2016; Thackeray et al., 2012).
The spread of health misinformation, however, highlights the risks of mediated influence when sources are unreliable (Burki, 2019; Johnson et al., 2020; Kata, 2010).

4.3. Marketing and Consumer Behavior

Marketers have embraced TSF principles, partnering with social media influencers to shape consumer attitudes (De Veirman et al., 2017; Hughes et al., 2019; Lou & Yuan, 2019). Electronic word-of-mouth (eWOM) and user reviews extend interpersonal recommendation online, with consumers themselves acting as opinion leaders (Cheung & Thadani, 2012; Hennig-Thurau et al., 2004). Online brand communities further enable peer-to-peer influence (Casaló et al., 2010; Laroche et al., 2012).

4.4. Misinformation and Disinformation

TSF concepts illuminate the spread of misinformation online, where a small number of influential users (including bots and highly active accounts) amplify false content (Grinberg et al., 2019; Shao et al., 2018; Starbird et al., 2014). Trust in the source, rather than message veracity, often drives sharing (Turcotte et al., 2015). However, algorithmic amplification, emotional resonance, and coordinated disinformation campaigns complicate the TSF framework (Benkler et al., 2018; Lazer et al., 2018; Vosoughi et al., 2018).

5. Critiques, Limitations, and Alternative Models

5.1. Critiques and Limitations of TSF in the Digital Age

Despite its adaptability, TSF faces substantial limitations in accounting for contemporary digital communication:
  • Oversimplification: The linear, two-step model is inadequate for describing multi-directional, networked, and algorithmically mediated information flows (Bennett & Segerberg, 2012; Castells, 2009).
  • Fluid and Ephemeral Leadership: Online opinion leadership is highly contextual, transient, and often driven by algorithmic visibility rather than inherent expertise (boyd, 2010; Turcotte et al., 2015).
  • Active Audiences and Direct Media Effects: Digital audiences actively seek, interpret, remix, and produce content, challenging the notion of passive followers and mediated influence (Jenkins, 2006; Livingstone, 2004).
  • Algorithmic Mediation: TSF does not account for the powerful role of platform algorithms in shaping exposure, visibility, and influence (Bucher, 2017; Gillespie, 2014; Noble, 2018).
  • Beyond Persuasion: Digital communication serves functions beyond persuasion, including community-building, identity expression, and deliberation, which are not addressed by TSF (Baym, 2010; Papacharissi, 2010).
  • Online-Offline Nexus: Influence operates across digital and offline contexts, with complex feedback loops not captured by the original model (Couldry & Hepp, 2017; Wellman, 2001).
  • Trust and Authenticity: The basis of trust in digital opinion leadership (parasocial relationships, algorithmic amplification) differs from face-to-face trust envisioned in TSF (Dubois et al., 2020; Marwick, 2015).

5.2. Alternative and Complementary Models

Table 4. Comparison of TSF with Alternative/Complementary Models.
  • Networked Influence Models: These models employ network science to analyze how structure and individual attributes interact to facilitate diffusion and cascades (Aral & Walker, 2012; Bakshy et al., 2012; Watts & Dodds, 2007).
  • Diffusion of Innovations: Rogers’ (2003) diffusion model offers a process-oriented perspective, describing how innovations spread through social systems, involving multiple adopter categories and communication channels.
  • Social Identity Model of Deindividuation Effects (SIDE): This framework explains how group identity and anonymity shape behavior and influence in digital environments (Postmes et al., 1998; Spears & Lea, 1994).
  • Algorithmic Influence Frameworks: These models explore how algorithms mediate content exposure, confer visibility, and interact with social dynamics (Bucher, 2017; Cotter et al., 2022; Gillespie, 2014).
  • Logic of Connective Action: Bennett and Segerberg (2012) propose that large-scale digital mobilization often bypasses traditional leaders via personalized, networked communication flows.
  • Hybrid Models: Recent scholarship advocates for integrating TSF, network analysis, diffusion theory, and algorithmic studies to capture the interplay of social, structural, and technological factors (Hilbert et al., 2017; Weeks et al., 2017).

6. Synthesis and Future Research Directions

6.1. Synthesis of Findings

The literature reviewed demonstrates that while the original, linear TSF model is insufficient for the digital era, its foundational insight—that media effects are socially mediated—remains salient. The digital ecosystem has fundamentally transformed the mechanisms and pathways of influence, introducing new actors (digital influencers, micro-celebrities, algorithms), new structures (networked flows, platform architectures), and new complexities (algorithmic mediation, cross-platform diffusion). Core elements of TSF—mediated influence, the importance of interpersonal networks, and the role of trust—continue to underpin research across politics, health, marketing, and the spread of misinformation. However, the processes are now characterized by:
  • Fragmentation and diversification of opinion leadership.
  • Complex, multi-step, and networked information flows.
  • Centrality of platform affordances and algorithmic gatekeeping.
  • Contextual variation across issues, platforms, and cultural settings.
  • The need for hybrid, integrative models that reflect the interplay of human, social, and technological factors.

6.2. Future Research Directions

The literature suggests several avenues for future inquiry:
  • Integrated Models: Develop robust models integrating human agency, network structure, platform architecture, and content characteristics (Hilbert et al., 2017).
  • Algorithmic Mediation: Investigate the role of algorithms as mediators, including their effects on opinion leadership, trust, and public perception (Bucher, 2017; Noble, 2018; Yeo et al., 2021).
  • Cross-Platform Dynamics: Analyze how influence flows across multiple platforms and how platform ecosystems collectively shape public discourse (Chadwick et al., 2021).
  • Longitudinal Analysis: Conduct longitudinal studies to understand the evolution of influence networks and the dynamics of opinion leadership over time (Aral & Dhillon, 2018).
  • Online-Offline Interactions: Further explore the interplay between online and offline influence, including the translation of digital authority to real-world impact (Couldry & Hepp, 2017; Vaccari & Valeriani, 2016).
  • Nuanced Leadership Typologies: Examine the diversity, motivations, and mechanisms of digital opinion leadership, moving beyond monolithic conceptions of "influencers" (Abidin, 2016; Dubois et al., 2020).
  • Comparative and Global Research: Expand research beyond Western contexts to explore the applicability of TSF and related models globally (Valeriani & Vaccari, 2018).
  • Influence in Malign Contexts: Investigate the role of human and algorithmic mediation in the spread of misinformation, hate speech, and polarization, developing targeted interventions (Benkler et al., 2018; Johnson et al., 2020).

7. Discussion

7.1. Enduring Relevance and Evolution of TSF

The Two-Step Flow theory’s core insight—the social mediation of media effects—remains remarkably relevant, even as the digital media landscape has fundamentally altered its mechanisms. Influential intermediaries, whether traditional elites, digital influencers, or algorithmically amplified accounts, continue to shape the flow of information and public opinion. However, the identity, stability, and basis of opinion leadership have become increasingly fragmented, situational, and performative (Marwick & boyd, 2011; Watts & Dodds, 2007).
The linearity of the original TSF model has given way to complex, multi-step, and networked flows, with information traveling through diverse pathways shaped by network structures, platform affordances, and algorithmic logics (Bakshy et al., 2012; Castells, 2009). Echo chambers and filter bubbles further complicate the dynamics of influence, potentially reinforcing group identities and limiting exposure to divergent perspectives (Barberá et al., 2015; Pariser, 2011).

7.2. Limitations and Gaps

While TSF continues to offer a valuable lens for examining mediated influence, its explanatory power is limited by its human-centric and linear assumptions. It often fails to account for:
  • The active role of audiences as content creators, remixers, and selective consumers (Bruns, 2008; Jenkins, 2006).
  • The algorithmic mediation of visibility, reach, and influence (Bucher, 2017; Gillespie, 2014).
  • The diversity of communication goals in digital environments, including community-building and identity work (Baym, 2010; Papacharissi, 2010).
  • The interplay between online and offline influence.
  • The ethical and societal implications of algorithmic and influencer-mediated communication.

7.3. Integrating TSF with Contemporary Models

To address these gaps, future research must integrate TSF insights with network analysis, diffusion of innovations, algorithmic studies, and social psychological models. Hybrid approaches can more accurately capture the interplay of social, technological, and individual factors shaping influence in the digital age (Hilbert et al., 2017; Weeks et al., 2017).

8. Recommendations

Based on this review, the following recommendations are offered for researchers, practitioners, and policymakers:
  • Adopt Hybrid Theoretical Frameworks: Combine TSF with network, diffusion, and algorithmic models to capture the complexity of digital influence.
  • Prioritize Empirical Network Analysis: Employ network metrics and longitudinal data to empirically identify opinion leaders and influence pathways.
  • Investigate Algorithmic Mediation: Critically examine how platform algorithms confer or diminish influence and shape public discourse.
  • Embrace Cross-Platform and Cross-Cultural Research: Study influence diffusion across multiple platforms and diverse cultural settings.
  • Examine Ethical Implications: Address the ethical challenges posed by algorithmic amplification, influencer marketing, and the spread of misinformation.
  • Promote Digital and Algorithmic Literacy: Enhance public understanding of algorithmic curation to foster critical media consumption.
  • Develop Interventions for Malign Influence: Design targeted interventions to mitigate the spread of misinformation, hate speech, and polarization.
  • Foster Interdisciplinary Collaboration: Engage scholars from communication, sociology, computer science, psychology, and policy studies to develop integrative models.

9. Conclusion

The Two-Step Flow theory, conceived in the context of mid-20th-century mass communication, continues to echo in the digital age. Its central premise—that communication is fundamentally a social process mediated by trusted intermediaries—remains pertinent, even as the digital environment introduces new actors, structures, and mechanisms. The review demonstrates that TSF’s value lies not in its original, literal structure but in its recognition of the importance of social mediation. To remain relevant, TSF must be reconceptualized and integrated with complementary models that account for the complexities of digital media—networked flows, algorithmic mediation, diversified leadership, and active audiences. The challenge for future research is to develop nuanced, context-aware, and dynamic frameworks that reflect the interplay of human, social, and technological factors shaping influence in the 21st century.

Funding

The study received no specific financial support.

Institutional Review Board Statement

Not applicable.

Transparency

The author confirms that the manuscript is an honest, accurate, and transparent account of the study, that no vital features of the study have been omitted, and that any discrepancies from the study as planned have been explained. This study followed all ethical practices during writing.

Competing Interests

The author declares that there are no conflicts of interest regarding the publication of this paper.

References

  1. Abidin, C. (2016). Visibility labour: Engaging with influencers’ fashion brands and #OOTD advertorial campaigns on Instagram. Media International Australia, 161(1), 86–100. [CrossRef]
  2. Agarwal, R., & Prasad, J. (1998). A conceptual and operational definition of personal innovativeness in the domain of information technology. Information Systems Research, 9(2), 204–215. [CrossRef]
  3. An, J., Quercia, D., & Crowcroft, J. (2014). Partisan sharing: Facebook evidence and societal consequences. Proceedings of the ACM Conference on Computer Supported Cooperative Work & Social Computing (CSCW), 443-451. [CrossRef]
  4. Aral, S., & Dhillon, P. S. (2018). Digital influence and the diffusion of behavior. MIT Initiative on the Digital Economy Working Paper.
  5. Aral, S., & Walker, D. (2012). Identifying influential and susceptible individuals in networks. Science, 337(6092), 337–341. [CrossRef]
  6. Bakshy, E., Hofman, J. M., Mason, W. A., & Watts, D. J. (2011). Everyone's an influencer: Quantifying influence on twitter. Proceedings of the Fourth ACM International Conference on Web Search and Data Mining (WSDM), 65-74. [CrossRef]
  7. Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130–1132. [CrossRef]
  8. Bakshy, E., Rosenn, I., Marlow, C., & Adamic, L. A. (2012). The role of social networks in information diffusion. Proceedings of the 21st International Conference on World Wide Web (WWW), 519-528. [CrossRef]
  9. Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting from left to right: Is online political communication more than an echo chamber? Psychological Science, 26(10), 1531–1542. [CrossRef]
  10. Bauer, R. A. (1964). The obstinate audience: The influence process from the point of view of social communication. American Psychologist, 19(5), 319–328. [CrossRef]
  11. Baym, N. K. (2010). Personal connections in the digital age. Polity Press.
  12. Benkler, Y., Faris, R., & Roberts, H. (2018). Network propaganda: Manipulation, disinformation, and radicalization in American politics. Oxford University Press.
  13. Bennett, W. L., & Manheim, J. B. (2006). The one-step flow of communication. The Annals of the American Academy of Political and Social Science, 608(1), 213–232. [CrossRef]
  14. Bennett, W. L., & Segerberg, A. (2012). The logic of connective action: Digital media and the personalization of contentious politics. Information, Communication & Society, 15(5), 739–768. [CrossRef]
  15. Bimber, B., & Davis, R. (2003). Campaigning online: The internet in U.S. elections. Oxford University Press.
  16. Boulianne, S. (2015). Social media use and participation: A meta-analysis of current research. Information, Communication & Society, 18(5), 524–538. [CrossRef]
  17. boyd, d. (2010). Social network sites as networked publics: Affordances, dynamics, and implications. In Z. Papacharissi (Ed.), A networked self: Identity, community, and culture on social network sites (pp. 39–58). Routledge.
  18. Bruns, A. (2008). Blogs, Wikipedia, Second Life, and beyond: From production to produsage. Peter Lang.
  19. Bruns, A., & Burgess, J. E. (2011). The use of Twitter hashtags in the formation of ad hoc publics. Proceedings of the 6th European Consortium for Political Research (ECPR) General Conference.
  20. Bucher, T. (2017). The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. Information, Communication & Society, 20(1), 30–44. [CrossRef]
  21. Burki, T. K. (2019). Vaccine misinformation and social media. The Lancet Digital Health, 1(6), e258–e259. [CrossRef]
  22. Campbell, C., & Farrell, J. R. (2020). More than meets the eye: The functional components underlying influencer marketing. Business Horizons, 63(4), 469–479. [CrossRef]
  23. Casaló, L. V., Flavián, C., & Guinalíu, M. (2010). Determinants of the success of corporate blogs as a channel to build customer relationships. Management Decision, 48(6), 904–928. [CrossRef]
  24. Casaló, L. V., Flavián, C., & Ibáñez-Sánchez, S. (2020). Influencers on Instagram: Antecedents and consequences of opinion leadership. Journal of Business Research, 117, 510–519. [CrossRef]
  25. Castells, M. (2009). Communication power. Oxford University Press.
  26. Centola, D., & Macy, M. (2007). Complex contagion and the weakness of long ties. American Journal of Sociology, 113(3), 702–734. [CrossRef]
  27. Chadwick, A., Dennis, J., & Smith, A. P. (2021). Politics in the pandemic: Platforms, civic spaces and emerging modes of political participation. Political Studies Association.
  28. Chen, J., Wang, Y., & Wang, X. (2018). Understanding the mechanisms of online health communities: A study of social support and online health opinion leaders. International Journal of Environmental Research and Public Health, 15(4), 764. [CrossRef]
  29. Cheung, C. M., & Thadani, D. R. (2012). The impact of electronic word-of-mouth communication: A literature analysis and integrative model. Decision Support Systems, 54(1), 461–470. [CrossRef]
  30. Choi, S. (2015). The two-step flow of communication in the digital age: Assessing the roles of opinion leaders and web sites in the flow of political information. American Politics Research, 43(3), 438–466. [CrossRef]
  31. Conway, B. A., Kenski, K., & Wang, D. (2015). The rise of Twitter in the political campaign: Searching for intermedia agenda-setting effects in the presidential primary. Journal of Computer-Mediated Communication, 20(4), 363–380. [CrossRef]
  32. Cotter, K., Cho, A., & Rader, E. (2022). Algorithmic mediation of collective action on social media. New Media & Society. [CrossRef]
  33. Couldry, N., & Hepp, A. (2017). The mediated construction of reality. Polity Press.
  34. De Veirman, M., Cauberghe, V., & Hudders, L. (2017). Marketing through Instagram influencers: The impact of number of followers and product divergence on brand attitude. International Journal of Advertising, 36(5), 798–828. [CrossRef]
  35. Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., Stanley, H. E., & Quattrociocchi, W. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences, 113(3), 554–559. [CrossRef]
  36. Di Gennaro, C., & Dutton, W. H. (2006). The internet and the public: Online and offline political participation in the United Kingdom. Parliamentary Affairs, 59(2), 299–313. [CrossRef]
  37. Dubois, E., Gruzd, A., & Jacobson, J. (2020). Journalists’ reliance on social media for newsgathering: A comparative study of Canadian and American practices. Digital Journalism, 8(1), 1–20. [CrossRef]
  38. Evans, N. J., Phua, J., Lim, J., & Jun, H. (2017). Disclosing Instagram influencer advertising: The effects of disclosure language on advertising recognition, attitudes, and behavioral intent. Journal of Interactive Advertising, 17(2), 138–149. [CrossRef]
  39. Eysenbach, G., Powell, J., Englesakis, M., Rizo, C., & Stern, A. (2004). Health related virtual communities and electronic support groups: Systematic review of the effects of online peer to peer interactions. BMJ, 328(7449), 1166. [CrossRef]
  40. Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter bubbles, echo chambers, and online news consumption. Public Opinion Quarterly, 80(S1), 298–320. [CrossRef]
  41. Gibson, R. K., & Cantijoch, M. (2013). Conceptualizing and measuring participation in the digital age: The case of Britain. The British Journal of Politics & International Relations, 15(4), 529–548. [CrossRef]
  42. Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. Boczkowski, & K. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–193). MIT Press.
  43. Goel, S., Watts, D. J., & Goldstein, D. G. (2012). The structure of online diffusion networks. Proceedings of the 13th ACM Conference on Electronic Commerce (EC), 623-638. [CrossRef]
  44. Gökalp, E., Eren, P. E., & Eren, M. Ş. (2022). Micro versus macro influencers on Instagram: The role of perceived authenticity and homophily. European Journal of Marketing, 56(13), 174–199. [CrossRef]
  45. Gonzalez-Bailon, S., Borge-Holthoefer, J., & Moreno, Y. (2011). Broadcasters and hidden influentials in online protest diffusion. American Behavioral Scientist, 55(10), 1247–1266. [CrossRef]
  46. Goodyear, V. A., Armour, K. M., & Wood, H. (2018). Young people learning about health: The role of online sources and digitally mediated communication. Media Education Research Journal, 26(2), 115–128. [CrossRef]
  47. Granovetter, M. S. (1973). The strength of weak ties. American Journal of Sociology, 78(6), 1360–1380. [CrossRef]
  48. Grinberg, N., Joseph, K., Friedland, L., Swire-Thompson, B., & Lazer, D. (2019). Fake news on Twitter during the 2016 U.S. presidential election. Science, 363(6425), 374–378. [CrossRef]
  49. Guille, A., Hacid, H., Favre, C., & Zighed, D. A. (2013). Information diffusion in online social networks: A survey. ACM SIGMOD Record, 42(2), 17–28. [CrossRef]
  50. Haim, M., Graefe, A., & Brosius, H.-B. (2018). Burst of the filter bubble? Effects of personalization on the diversity of online news. Digital Journalism, 6(3), 330–346. [CrossRef]
  51. Hansen, D. L., Shneiderman, B., & Smith, M. A. (2011). Analyzing social media networks with NodeXL: Insights from a connected world. Morgan Kaufmann.
  52. Heldman, A. B., Schindelar, J., & Weaver, J. B., III. (2013). Social media engagement and public health communication: Implications for public health organizations being truly “social”. Public Health Reviews, 35(1), 13. [CrossRef]
  53. Hennig-Thurau, T., Gwinner, K. P., Walsh, G., & Gremler, D. D. (2004). Electronic word-of-mouth via consumer-opinion platforms: What motivates consumers to articulate themselves on the internet? Journal of Interactive Marketing, 18(1), 38–52. [CrossRef]
  54. Hesse, B. W., Nelson, D. E., Kreps, G. L., Croyle, R. T., Arora, N. K., Rimer, B. K., & Viswanath, K. (2005). Trust and sources of health information: The impact of the Internet and its implications for health communication. Archives of Internal Medicine, 165(22), 2618–2624. [CrossRef]
  55. Hilbert, M., Vásquez, J., Halpern, D., Valenzuela, S., & Arriagada, E. (2017). One step, two step, network step? Complementary perspectives on communication flows in Twittered citizen protests. Social Science Computer Review, 35(4), 444–461. [CrossRef]
  56. Himelboim, I. (2011). Civil society and online political discourse: The network structure of unrestricted discussions. Communication Research, 38(5), 634–659. [CrossRef]
  57. Himelboim, I., McCreery, S., & Smith, M. (2012). Birds of a feather tweet together: Integrating network and content analyses to examine cross-ideology exposure on Twitter. Journal of Computer-Mediated Communication, 18(2), 40–60. [CrossRef]
  58. Ho, S. S., Lwin, M. O., Lee, E. W. J., & Shin, W. (2021). Examining the effects of social media opinion leaders in evoking positive emotions and support for environmental causes among youths: The case of #SaveLeuserEcosystem. Journal of Communication, 71(1), 1–26. [CrossRef]
  59. Hughes, C., Swaminathan, V., & Brooks, G. (2019). Driving brand engagement through online social influencers: An empirical investigation of sponsored blogging campaigns. Journal of Marketing, 83(5), 78–96. [CrossRef]
  60. Hwang, K. O., Ottenbacher, A. J., Green, A. P., Cannon-Diehl, M. R., Richardson, O., Bernstam, E. V., & Thomas, E. J. (2010). Social support in an Internet weight loss community. International Journal of Medical Informatics, 79(1), 5–13. [CrossRef]
  61. Iyengar, S., & Hahn, K. S. (2009). Red media, blue media: Evidence of ideological selectivity in media use. Journal of Communication, 59(1), 19–39. [CrossRef]
  62. Jenkins, H. (2006). Convergence culture: Where old and new media collide. New York University Press.
  63. Jin, S. V., Muqaddam, A., & Ryu, E. (2019). Instafamous and social media influencer marketing. Marketing Intelligence & Planning, 37(5), 567–579. [CrossRef]
  64. Johnson, N. F., Velásquez, N., Restrepo, N. J., Leahy, R., Gabriel, N., El Oud, S., Zheng, M., Manrique, P., Wuchty, S., & Lupu, Y. (2020). The online competition between pro- and anti-vaccination views. Nature, 582(7811), 230–233. [CrossRef]
  65. Kata, A. (2010). A postmodern Pandora's box: Anti-vaccination misinformation on the Internet. Vaccine, 28(7), 1709–1716. [CrossRef]
  66. Katz, E. (1957). The two-step flow of communication: An up-to-date report on an hypothesis. Public Opinion Quarterly, 21(1), 61–78. [CrossRef]
  67. Katz, E., & Lazarsfeld, P. F. (1955). Personal influence: The part played by people in the flow of mass communications. Free Press.
  68. Khamis, S., Ang, L., & Welling, R. (2017). Self-branding, ‘micro-celebrity’ and the rise of Social Media Influencers. Celebrity Studies, 8(2), 191–208. [CrossRef]
  69. Kite, J., Foley, B. C., Grunseit, A. C., & Freeman, B. (2016). Please like me: Insights into social media use by Australian health promotion agencies to disseminate campaigns. Australian and New Zealand Journal of Public Health, 40(5), 468–474. [CrossRef]
  70. Kwak, H., Lee, C., Park, H., & Moon, S. (2010). What is Twitter, a social network or a news media? Proceedings of the 19th International Conference on World Wide Web (WWW), 591-600. [CrossRef]
  71. Laroche, M., Habibi, M. R., Richard, M.-O., & Sankaranarayanan, R. (2012). The effects of social media based brand communities on brand community markers, value creation practices, brand trust and brand loyalty. Computers in Human Behavior, 28(5), 1755–1767. [CrossRef]
  72. Lazarsfeld, P. F., Berelson, B., & Gaudet, H. (1944). The people's choice: How the voter makes up his mind in a presidential campaign. Duell, Sloan and Pearce.
  73. Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., Metzger, M. J., Nyhan, B., Pennycook, G., Rothschild, D., Schudson, M., Sloman, S. A., Sunstein, C. R., Thorson, E. A., Watts, D. J., & Zittrain, J. L. (2018). The science of fake news. Science, 359(6380), 1094–1096. [CrossRef]
  74. Lee, J. E., & Watkins, B. (2016). YouTube vloggers’ influence on consumer luxury brand perceptions and intentions. Journal of Business Research, 69(12), 5753–5760. [CrossRef]
  75. Lin, N. (1971). The study of human communication. Bobbs-Merrill.
  76. Livingstone, S. (2004). Media literacy and the challenge of new information and communication technologies. The Communication Review, 7(1), 3–14. [CrossRef]
  77. Lou, C., & Yuan, S. (2019). Influencer marketing: How message value and credibility affect consumer trust of branded content on social media. Journal of Interactive Advertising, 19(1), 58–73. [CrossRef]
  78. Marwick, A. E. (2015). Instafame: Luxury selfies in the attention economy. Public Culture, 27(1 (75)), 137–160. [CrossRef]
  79. Marwick, A. E., & boyd, d. (2011). I tweet honestly, I tweet passionately: Twitter users, context collapse, and the imagined audience. New Media & Society, 13(1), 114–133. [CrossRef]
  80. McCombs, M. E., & Shaw, D. L. (1972). The agenda-setting function of mass media. Public Opinion Quarterly, 36(2), 176–187. [CrossRef]
  81. Meraz, S. (2009). Is there an elite hold? Traditional media to social media agenda setting influence in blog networks. Journal of Computer-Mediated Communication, 14(3), 682–707. [CrossRef]
  82. Messing, S., & Westwood, S. J. (2014). Selective exposure in the age of social media: Endorsements trump partisan source affiliation when selecting news online. Communication Research, 41(8), 1042–1063. [CrossRef]
  83. Napoli, P. M. (2014). On the central role of concentration in the communications policy discourse. International Journal of Communication, 8, 2850–2871.
  84. Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. NYU Press.
  85. Paek, H.-J., Kim, K., & Hove, T. (2010). Content analysis of antismoking videos on YouTube: Message sensation value, message appeals, and their relationships with viewer responses. Health Education & Behavior, 37(2), 208–229. [CrossRef]
  86. Papacharissi, Z. (Ed.). (2010). A networked self: Identity, community, and culture on social network sites. Routledge.
  87. Pariser, E. (2011). The filter bubble: What the internet is hiding from you. Penguin UK.
  88. Park, D.-H., Lee, J., & Han, I. (2007). The effect of on-line consumer reviews on consumer purchasing intention: The moderating role of involvement. International Journal of Electronic Commerce, 11(4), 125–148. [CrossRef]
  89. Parmelee, J. H. (2014). The agenda-building function of political tweets. New Media & Society, 16(3), 434–450. [CrossRef]
  90. Postmes, T., Spears, R., & Lea, M. (1998). Breaching or building social boundaries? SIDE-effects of computer-mediated communication. Communication Research, 25(6), 689–715. [CrossRef]
  91. Quattrociocchi, W., Scala, A., & Sunstein, C. R. (2016). Echo chambers on Facebook. SSRN Electronic Journal. [CrossRef]
  92. Rainie, L., & Wellman, B. (2012). Networked: The new social operating system. MIT Press.
  93. Robinson, J. P. (1976). Interpersonal influence in election campaigns: Two step-flow hypotheses. Public Opinion Quarterly, 40(3), 304–319. [CrossRef]
  94. Rogers, E. M. (2003). Diffusion of innovations (5th ed.). Free Press.
  95. Schouten, A. P., Janssen, L., & Verspaget, M. (2020). Celebrity vs. influencer endorsements in advertising: The role of identification, credibility, and product-endorser fit. International Journal of Advertising, 39(2), 258–281. [CrossRef]
  96. Shao, C., Ciampaglia, G. L., Varol, O., Yang, K.-C., Flammini, A., & Menczer, F. (2018). The spread of low-credibility content by social bots. Nature Communications, 9(1), 4787. [CrossRef]
  97. Sokolova, K., & Kefi, H. (2020). Instagram and YouTube bloggers promote it, why should I buy? How credibility and parasocial interaction influence purchase intentions. Journal of Retailing and Consumer Services, 53, 101742. [CrossRef]
  98. Spears, R., & Lea, M. (1994). Panacea or panopticon? The hidden power in computer-mediated communication. Communication Research, 21(4), 427–459. [CrossRef]
  99. Starbird, K., Maddock, J., Orand, M., Achterman, P., & Mason, R. M. (2014). Rumors, false flags, and digital vigilantes: Misinformation on Twitter after the 2013 Boston Marathon bombing. Proceedings of the iConference 2014, 654-662. [CrossRef]
  100. Starbird, K., Arif, A., & Wilson, T. (2019). Disinformation as collaborative work: Surfacing the participatory nature of strategic information operations. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), Article 127. [CrossRef]
  101. Sunstein, C. R. (2017). #Republic: Divided democracy in the age of social media. Princeton University Press.
  102. Thackeray, R., Neiger, B. L., Smith, A. K., & Van Wagenen, S. B. (2012). Adoption and use of social media among public health departments. BMC Public Health, 12(1), 242. [CrossRef]
  103. Tong, S. T., Van Der Heide, B., Langwell, L., & Walther, J. B. (2008). Too much of a good thing? The relationship between number of friends and interpersonal impressions on Facebook. Journal of Computer-Mediated Communication, 13(3), 531–549. [CrossRef]
  104. Troldahl, V. C. (1966). A field test of a modified "two-step flow of communication" model. Public Opinion Quarterly, 30(4), 609–623. [CrossRef]
  105. Tucker, J. A., Guess, A., Barberá, P., Vaccari, C., Siegel, A., Sanovich, S., Stukal, D., & Nyhan, B. (2018). Social media, political polarization, and political disinformation: A review of the scientific literature. Hewlett Foundation Working Paper.
  106. Turcotte, J., York, C., Irving, J., Scholl, R. M., & Pingree, R. J. (2015). News recommendations from social media opinion leaders: Effects on media trust and information seeking. Journal of Computer-Mediated Communication, 20(5), 520–535. [CrossRef]
  107. Vaccari, C., & Valeriani, A. (2015). Party campaigners or citizen campaigners? How social media use shapes campaign participation. Party Politics, 21(3), 421–431. [CrossRef]
  108. Vaccari, C., & Valeriani, A. (2016). Political conversations in online networks: Structural properties and determinants of discussion networks on social networking sites. Information, Communication & Society, 19(7), 931–949. [CrossRef]
  109. Valenzuela, S., Halpern, D., & Katz, J. E. (2018). A network approach to the study of political participation and exposure to disagreement online. Social Science Computer Review, 36(2), 147–164. [CrossRef]
  110. Valeriani, A., & Vaccari, C. (2018). Political talk on mobile instant messaging services: A comparative analysis of six countries. Information, Communication & Society, 21(11), 1715–1731. [CrossRef]
  111. Van den Ban, A. W. (1964). A revision of the two-step flow of communications hypothesis. Gazette, 10(3), 237–250. [CrossRef]
  112. van Dijck, J. (2013). The culture of connectivity: A critical history of social media. Oxford University Press.
  113. Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. [CrossRef]
  114. Watts, D. J., & Dodds, P. S. (2007). Influentials, networks, and public opinion formation. Journal of Consumer Research, 34(4), 441–458. [CrossRef]
  115. Weeks, B. E., Ardèvol-Abreu, A., & Gil de Zúñiga, H. (2017). Online influence? Social media use, opinion leadership, and political persuasion. International Journal of Public Opinion Research, 29(2), 214–239. [CrossRef]
  116. Weimann, G. (1982). On the importance of marginality: One more step into the two-step flow of communication. American Sociological Review, 47(6), 764–773. [CrossRef]
  117. Wellman, B. (2001). Physical place and cyberplace: The rise of personalized networking. International Journal of Urban and Regional Research, 25(2), 227–252. [CrossRef]
  118. Weng, J., Lim, E.-P., Jiang, J., & He, Q. (2010). TwitterRank: Finding topic-sensitive influential twitterers. Proceedings of the Third ACM International Conference on Web Search and Data Mining (WSDM), 261-270. [CrossRef]
  119. Wright, K. B. (2016). Communication in health-related online social support groups/communities: A review of research on predictors of participation, applications of social support theory, and health outcomes. Review of Communication Research, 4, 65–87. [CrossRef]
  120. Yeo, S. K., McKasy, M., & Cacciatore, M. A. (2021). Filtering out the noise: The role of trust, lay-expert gap, and algorithmic literacy in shaping public perceptions of algorithmic curation. Mass Communication and Society, 24(4), 483–506. [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.