Health New Media Res > Volume 7(1); 2023 > Article
Thaker and Ganchoudhuri: Trust in social media is associated with misperceptions about COVID-19


Although social media is a primary means for the general public to access science and health information and can help increase public knowledge, empirical evidence is mixed. Beyond social media exposure, this study investigates whether trust in social media platforms such as Facebook, Twitter, and YouTube is related to public knowledge about the coronavirus. The findings, based on data from a nationally representative sample of 3933 people in the United States, show that trust in Facebook and Twitter is negatively associated with knowledge of COVID-19, even after controlling for a number of traditional factors associated with scientific knowledge. Republicans' trust in Twitter contributes to this knowledge gap, although the interaction between Republican affiliation and Twitter trust, while significant, is weak. The findings indicate that, despite increased suppression of fake and misleading information by social media companies, misinformation on social media persists and may lead to harm.


On February 2, 2020, the World Health Organization (WHO) warned that we were not only fighting an epidemic but also a massive “infodemic,” which it defined as “an over-abundance of information - some accurate and some not - that makes it hard for people to find trustworthy sources and reliable guidance when they need it” (WHO, 2020, p. 2). In March 2020, the WHO declared the coronavirus outbreak a pandemic. At the start of July 2020, there were over 10.4 million cases worldwide, with the United States accounting for over 25% of all deaths due to COVID-19, even as it accounts for only 5% of the world population. Preventing deaths and controlling the spread of COVID-19 rely on providing the public with accurate scientific information so that they can take preventive steps.
The Internet and social media sites are a primary gateway to science and technology news for the public (e.g., Brossard & Scheufele, 2013; Hitlin & Olmstead, 2018). Half of social media users in the United States report regularly seeing science posts (Hitlin & Olmstead, 2018). Social media platforms such as Facebook, Twitter, and YouTube are not only consumed by a large population but have also become a ‘public square’ for discussion and debate about scientific issues, including highly contested ones such as gene editing, climate change, and vaccination (Dunn et al., 2017; Ho et al., 2017; Kirilenko et al., 2015; Stocking et al., 2020), among others. The rise of social media as the primary source of scientific information has coincided with a decline in traditional science journalism (Schäfer, 2017; Weingart & Guenther, 2016). This new form of science communication is unrestricted by traditional gatekeeping of information, providing a platform for scientists and scientific institutions to communicate directly with their publics, with the potential to increase public knowledge and engagement.
Communicating science via social media is both a challenge and an opportunity for increasing public engagement with science and health issues. On the one hand, science and health news compete for public attention with other news about celebrities and politics, and with updates shared by family and friends on social media. On the other hand, science news on social media provides easy access for the majority of the public who may not be motivated to seek out science news on the websites of science or health agencies or the government. Public opinion surveys indicate that individuals use social media for science and research as much as for other topics, from entertainment news to politics (e.g., Hargittai et al., 2018). Indeed, social media news use was found to be more strongly related to trust in science than traditional news use (Huber et al., 2019), potentially because of the diversity of informational networks (e.g., Bakshy et al., 2015) and engagement with news through trusted social contacts (Media Insight Project, 2017), among other factors.
Yet social media sites are also megaphones for misinformation (e.g., Bode & Vraga, 2015; Scheufele & Krause, 2019; Wang et al., 2019). Scholars have expressed concern that the lack of editorial control over science news, of the kind exercised in traditional channels, will result in public disinterest in or distrust of science (Weingart & Guenther, 2016). Empirical evidence about the role of social media in raising public understanding of and engagement with scientific and health issues is also mixed (Ho et al., 2017; Huber et al., 2019; Mueller-Herbst et al., 2020). This research therefore addresses the need for a more scientific understanding of the impact of social media on public trust in and understanding of science (Davies & Hara, 2017; Schäfer, 2016; Weingart & Guenther, 2016).
This study adds to the field of science communication in three unique ways. First, while a steady stream of research has examined whether social media use is meeting its potential for improving political knowledge and participation (e.g., Bode & Vraga, 2015; Cacciatore et al., 2018), little is known about the impact of social media on public knowledge of (Hargittai et al., 2018; Huber et al., 2019; Mueller-Herbst et al., 2020) and engagement with (Ley & Brewer, 2018) scientific issues. While the field of science communication has moved away from the deficit model of communication, knowledge of scientific issues is still important in shaping public responses to science and health issues (Allum et al., 2008; Chung & Rimal, 2015; Drummond & Fischhoff, 2017), including, for example, in maintaining social distancing during the coronavirus pandemic (Bridgman et al., 2020).
Second, even the few studies that focus on the role of social media in informing the public about scientific issues have found mixed results, some positive (e.g., Huber et al., 2019; Mueller-Herbst et al., 2020) and others negative (Ho et al., 2017; Stecula et al., 2020). For example, a recent study found that social media use was associated with misperceptions regarding basic facts about COVID-19 in Canada (Bridgman et al., 2020). These studies, however, have only investigated exposure or use, while other important variables, such as trust in the source of information—one of the strongest factors in predicting science knowledge and beliefs (e.g., Connor & Siegrist, 2010; Hmielowski et al., 2013; Malka et al., 2009; Slovic, 1993)—are likely to provide a more accurate picture.
To our knowledge, this is the first study to examine how trust in social media, in addition to trust in traditional media and scientific organizations such as the Centers for Disease Control and Prevention (CDC) and the National Institutes of Health (NIH), shapes public knowledge about scientific and health issues. Trust in social media channels is likely to be a more powerful and important determinant of public attitudes and knowledge than incidental exposure, because trust both serves as a heuristic for processing information and results in the formation of stable attitudes and beliefs (Song et al., 2018; Stecula et al., 2020). People readily accept information from trusted sources and act on it, particularly on issues that are novel, abstract, or complex (e.g., Nisbet & Markowitz, 2016), such as the origin and spread of a new disease like the coronavirus.
Conversely, scholars have argued that lack of trust in vaccines and scientific institutions has resulted in disparities in health and science knowledge among racial minorities (e.g., Mantwill et al., 2015; Rikard, et al., 2016). Previous studies indicate that compared to White respondents, African American, Hispanic, and indigenous respondents display less trust in vaccines in general (Freimuth et al., 2017) as well as in health care authorities (Dudley et al., 2021; Rikard et al., 2016). Moreover, racial minorities are more likely to use social media for news compared to Whites, particularly YouTube (American Press Institute, 2015; Watson, 2023), indicating a further need to study different sources of social media trust among the public.
Third, previous studies have focused on only one platform, such as Facebook (e.g., Mueller-Herbst et al., 2020) or Twitter (Bridgman et al., 2020), or have relied entirely on experimental setups (Bode & Vraga, 2015; Vraga & Bode, 2017). While it is possible that individuals use social media platforms interchangeably, as some scholars (Huber et al., 2019) have implied, trust in these platforms varies in ways that are yet to be explored (Kim & Lee, 2016). Differences in motivations for use are likely to shape how individuals trust these social media channels and how that trust is associated with scientific knowledge. Finally, the coronavirus outbreak, unlike past health crises, is the first pandemic that has played out with a majority of the public participating online, with social media use increasing due to lockdowns. For example, US users spent an average of 65 minutes daily on social media in 2020, compared to 54 minutes a year before, with a large increase in Facebook and Twitter use (Dixon, 2022). Using data from a nationally representative sample of Americans, this study aims to go beyond exposure- and use-based research by robustly testing the association between trust in social media channels and knowledge about the coronavirus. Moreover, given partisan differences in source trust and beliefs about scientific issues (Drummond & Fischhoff, 2017; Gustafson et al., 2019; Hmielowski et al., 2013), we test whether such partisan differences also appear in the relationship between social media trust and knowledge about the coronavirus.

Literature Review

Social Media, Trust, and Knowledge about Scientific Issues

As argued above, social media has displaced traditional media channels as a primary gateway for the public to access scientific information. On the one hand, a majority of Americans report incidental exposure and attention to science news (Funk et al., 2017; Hitlin & Olmstead, 2018), as well as sharing such information on social media channels such as Twitter, Facebook, and YouTube (Hargittai et al., 2018). Social media use is associated with public trust in science (Huber et al., 2019) and knowledge gain (e.g., Huber et al., 2019; Mueller-Herbst et al., 2020). On the other hand, scientists and scientific institutions can communicate directly with the public through social media with no intermediate gatekeepers, resulting in “the most genuine type of communication and thereby allowing scientists to fulfill their accounting duty effectively by engaging eye to eye with the general public” (Weingart & Guenther, 2016, p. 5). Indeed, experimental studies show that the public doesn’t discount the credibility of scientists—often cited as a reason for scientists’ lack of engagement—even when they advocate on contested issues (Kotcher et al., 2017). Experts can also correct misinformation online (Bode & Vraga, 2015; Vraga & Bode, 2017).
The primary pathway of knowledge gain from social media is incidental and habitual exposure to issues (Baum, 2002; Bode, 2016; Cacciatore et al., 2018). The potential for incidental learning through passive consumption of mass media is well documented in political communication research (Krugman & Hartley, 1970) and has been shown to be associated with public perceptions of science and technology issues (Nisbet et al., 2016; Nisbet & Markowitz, 2016). Termed a “gateway effect,” this process describes how, as individuals encounter “hard news” alongside more entertaining content, they become more likely to learn about and engage with various issues. Importantly, because such exposure occurs without triggering motivated or purposive information seeking, it may result in knowledge gain particularly among less attentive individuals (Krugman & Hartley, 1970; Tewksbury et al., 2001). Both incidental exposure and active news seeking are related to engagement with news content (Oeldorf-Hirsch, 2018).
In the few studies on social media use and scientific knowledge gain, mixed results are common. Based on a cross-country sample, Huber et al. (2019) found that social media news use was positively associated with scientific knowledge and had a stronger association than traditional news use. Based on a national sample, Su et al. (2015) found that people who prefer to rely primarily on online-only sources for science news have a significantly higher level of knowledge of science issues than those who rely primarily on traditional formats for science news.
However, other studies have found that social media use was associated with less awareness and knowledge. Mueller-Herbst et al. (2020) found that Facebook use was negatively associated with awareness of gene editing, although time spent on Facebook was positively associated with awareness; they argued that greater involvement raises the chance of exposure to scientific news topics and therefore knowledge gain. In a recent study, Stecula et al. (2020) found that, compared to traditional media users, social media users were more likely to be misinformed about vaccines. Bridgman et al. (2020) found that exposure to social media was associated with misconceptions about COVID-19, which in turn were associated with lower compliance with social distancing measures in Canada.
Social media contains a fair amount of misinformation (e.g., Bridgman et al., 2020; Scheufele & Krause, 2019; Wang et al., 2019), and exposure to inaccurate or fake information can diminish public trust in science and public understanding of scientific issues. For example, several studies in climate change communication show that exposure to inaccurate coverage likely results in beliefs inconsistent with the scientific consensus about climate change (e.g., Feldman et al., 2012; Zhao et al., 2011). In other words, use of social media can potentially lead to either an increase or a decrease in scientifically consistent beliefs and knowledge, depending on one's exposure to, attention to, and elaboration of the information one sees on social media.
These mixed findings on the relationship between social media use and knowledge gain about scientific issues deserve more attention. We extend the above studies—primarily based on exposure to or use of social media—by focusing on the role of trust as a more proximate source of influence on knowledge gain. A number of studies on risk perception and science communication highlight that trust serves as a heuristic, particularly on complex scientific and health topics.

Trust in Social Media as an Information Source

While trust has been a central component of several science communication studies, there is a “conceptual confusion” (Lewis & Weigert, 1985, p. 975) about the definition of trust (see Schäfer, 2016; Hendriks et al., 2015). Here, we adopt the commonly accepted definition of trust (Earle, 2010; Poortinga & Pidgeon, 2003) as “a psychological state comprising the intention to accept vulnerability based upon positive expectations of the intentions or behavior of another” (Rousseau et al., 1998, p. 395). In line with recent conceptualizations and measurements of trust (Hendriks et al., 2015), trust is understood as comprising expertise, integrity, and benevolence. The media's role as a trust intermediary between scientists and the public particularly deserves attention, as the public primarily comes to know about scientific breakthroughs through the media. As Schäfer (2016) argued, trust in science and trust in the media as a source of information must be distinguished to “enable researchers to assess the relative importance of both factors in the production of mediated trust in science” (p. 3).
A large and substantial body of research shows that information from a trusted source helps individuals reduce cognitive effort and provides a shortcut for making judgments about an issue (e.g., Hmielowski et al., 2013; Malka et al., 2009; Poortinga & Pidgeon, 2003; Siegrist et al., 2005; Slovic, 1993). Individuals are cognitive misers (Fiske & Taylor, 1991) who may have neither the time nor the cognitive resources to understand any one particular issue comprehensively, especially a complex and emerging pandemic such as the coronavirus, which continues to challenge scientists.
A number of studies show that trust in scientists and scientific institutions is one of the strongest predictors of knowledge and general attitudes towards science (e.g., Nisbet & Markowitz, 2016; Stecula et al., 2020). Recently, based on a nationally representative sample of Americans, Stecula et al. (2020) found that distrust of medical authorities was one of the strongest predictors of vaccine misinformation, and this was true across different demographic groups and political beliefs. These associations were resilient over time, even with increasing media coverage of the CDC and NIH following the measles outbreak in the US in 2019. At the same time, scientific advances—such as the discovery of a vaccine for the Zika virus—help increase public trust in science (Hilgard & Jamieson, 2017).
Trust in social media channels may function in a similar way to trust in traditional media and scientific organizations, in that people who trust these channels are more likely to use them frequently, and the resulting repeated exposure may lead to knowledge gain (Bode, 2016; Cacciatore et al., 2018). Moreover, through the affordances of social media such as liking, sharing, and commenting, trust in social media may result in stronger attitudes and beliefs about scientific issues, thereby increasing attitude certainty and knowledge (Media Insight Project, 2017). Trust in social media can result in heuristic judgments—via incidental exposure—or in elaboration and discussion through commenting on and sharing information with one's social networks (Ho et al., 2017). To our knowledge, no previous study has evaluated the role of trust in social media sources in shaping public knowledge about scientific issues. We test this relationship by controlling for a series of factors that have been shown to result in knowledge gain, such as attention to news (e.g., Oschatz et al., 2019), trust in traditional media such as newspapers (e.g., Huber et al., 2019; Stecula et al., 2020), and trust in scientific organizations such as the CDC and NIH (e.g., Drummond & Fischhoff, 2017). In addition, a risk perception variable—perceived timing of harm—was used as a control variable.

Political Polarization and Knowledge

A number of studies report consistent and growing polarization along political lines on scientific issues in the US (Dunlap et al., 2016; Feldman et al., 2012; Kahan et al., 2012). Worryingly, such polarization increases with education and scientific knowledge levels on a variety of scientific topics, such as stem cell research, genetically modified crops, and climate change, among others (Bail et al., 2018; Drummond & Fischhoff, 2017; Kahan et al., 2012). Opinion polarization along political lines is a result of partisan media exposure and motivated reasoning. Diehl et al. (2019), for example, found that the association between social media news use and climate change beliefs was moderated by conservative political ideology. Partisan polarization can increase in a relatively short time; Gustafson et al. (2019) found that in just four months, the divide between Democrats and Republicans in support of the Green New Deal nearly doubled, from a 33-percentage-point difference in December 2018 to 64 percentage points in April 2019. Myers et al. (2017) found that political ideology has a strong influence on public trust in federal scientific research. While previously untested, it is important to understand whether potential knowledge gain through social media is also aligned along partisan lines. As there are no studies to guide the development of hypotheses specific to trust in social media—and given the mixed findings on social media use and news use—we seek to answer the following research questions:
RQ1: How is trust in social media as a source of accurate information about the coronavirus associated with knowledge about the coronavirus?
RQ2: Are there any partisan differences in the relationship between trust in social media and knowledge about the coronavirus?


Data from a national sample of respondents in the United States were used for the study. A total of N=4493 respondents were recruited by Climate Nexus polling during April 3-7, 2020, using stratified sampling methods matching census parameters for sex, race, age, education, income, and geographic region. Participants received compensation in line with the specific market segment and respondents' preferences (e.g., cash, gift cards, reward points). A total of 560 respondents were excluded from the final sample for a variety of reasons, including dropping out of the survey soon after starting or completing the survey in less than 28% of the median response time, resulting in a final sample size of 3933. Sampling weights were used to account for small deviations from the census estimates, and demographic information for the sample is listed in Table 1. The dataset is publicly available (Link deleted to keep for blind review).


The dependent variable, scientifically accurate knowledge, was measured with a series of fifteen statements prefaced as follows: “To the best of your knowledge, are each of the following statements true or false?” The response options were Yes (1) and No (0); “Don't know” responses and refusals were coded as missing data. Eight statements were scientifically accurate, relating to symptoms (dry cough, 82% Yes; fever, 90%), spread (“the coronavirus can be spread by people who do not show symptoms,” 91%; “the coronavirus can live on some surfaces for up to three days,” 78%), protection (frequent hand washing, 92%; avoiding large gatherings, 92%; maintaining a 6-foot physical distance from others, 74%), and cure (“there is currently no cure for the coronavirus,” 81%). Five statements were scientifically inaccurate and referred to a variety of misconceptions: “drinking water can flush the coronavirus into your stomach where the acid will kill it” (18%), “antibiotics can prevent or kill the coronavirus” (12%), “there is currently a vaccine available to the public that is proven to prevent coronavirus infection” (10%), “sneezing is a symptom of coronavirus infection” (39%), and “hand sanitizers are better than soap and water for preventing infections” (24%). Two statements, about non-contagion (28%) and immunity after recovery (25%), were not used in the analysis because, according to the WHO, there is currently no evidence to support either claim, even though both were covered broadly in the media. The dependent variable was computed by summing endorsements of the scientifically accurate statements and subtracting endorsements of the statements inconsistent with WHO information about COVID-19 (M=5.86, SD=1.98; KR-20 (Kuder-Richardson Formula 20) = .46).
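The scoring procedure described above can be sketched in Python. This is a minimal illustration, not the study's actual code: the response data below are hypothetical stand-ins, and the KR-20 function simply implements the standard textbook formula for dichotomous items.

```python
import numpy as np

def kr20(items: np.ndarray) -> float:
    """Kuder-Richardson Formula 20 for dichotomous (0/1) items.
    items: array of shape (n_respondents, n_items), 1 = correct answer."""
    k = items.shape[1]
    p = items.mean(axis=0)                     # proportion correct per item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - (p * (1 - p)).sum() / total_var)

# Hypothetical responses: 4 respondents x (8 accurate + 5 inaccurate) statements,
# where 1 = respondent answered "Yes" to the statement
accurate = np.array([
    [1, 1, 1, 1, 1, 1, 1, 1],
    [1, 1, 0, 1, 1, 1, 0, 1],
    [0, 1, 1, 0, 1, 0, 1, 0],
    [1, 0, 0, 0, 0, 1, 0, 0],
])
inaccurate = np.array([
    [0, 0, 0, 0, 0],
    [0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0],
    [1, 1, 1, 1, 1],
])

# Knowledge score: endorsements of accurate statements minus endorsements
# of inaccurate ones (possible range: -5 to +8)
knowledge = accurate.sum(axis=1) - inaccurate.sum(axis=1)

# For reliability, rescore so that 1 always means "correct":
# rejecting an inaccurate statement counts as a correct answer
scored = np.hstack([accurate, 1 - inaccurate])
alpha = kr20(scored)
```

The rescoring step matters: KR-20 assumes every item is keyed so that 1 indicates the scientifically accurate response, whereas the knowledge score itself subtracts raw endorsements of the inaccurate statements.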

Independent variables

Trust in information from social media channels was gauged by asking respondents, “How much do you trust or distrust the following as a source of accurate information about the coronavirus?” The responses ranged from “Strongly distrust” to “Strongly trust.” “Don't Know” responses were coded as the middle category, while responses from those who chose not to answer (only three respondents) were coded as missing data. Similarly, trust in information from national newspapers (“such as New York Times, Washington Post, USA Today, etc.”) was coded on a five-point scale (M=3.53, SD=1.30).
Similarly, trust in scientific institutions was measured on a four-point scale by asking respondents about how much they trust or distrust the following as a source of accurate information: CDC (M=3.35, SD=.76) and NIH (M=3.25, SD=.75).
Attention to the news (M=3.56, SD=.63) was measured by asking respondents, “How closely have you been following news about the coronavirus?” The responses were measured on a four-point scale from “Not at all closely” (1) to “Very closely” (4). Only three respondents refused to answer the question; they were treated as missing data.
Perceived harm was measured by asking respondents, “When do you think the coronavirus will start to harm people in your community?” (M=5.03, SD=1.45). The response options were: “Never” (7%), “More than 3 months from now” (2.4%), “In 2 to 3 months” (3.2%), “In about a month” (9.7%), “In the next 2 to 3 weeks” (23.1%), and “They are being harmed right now” (54.5%). Only one respondent did not answer the question.
Demographic variables are listed in Table 1. Income was measured by asking respondents to estimate their total household income in the past 12 months before taxes: Less than $25,000 (19.8%), $25,000 to $49,999 (20.9%), $50,000 to $74,999 (18.6%), $75,000 to $99,999 (11%), $100,000 to $149,999 (20.3%), and $150,000 or more (9.4%). Ethnicity was dummy coded to compare those who self-identify as White (62%) with all others. Region was dummy coded comparing individuals living in the South with those living in the Northeast (17%), Midwest (21%), and West (24%). Political party affiliation (M=3.93, SD=1.80) was measured using 7 categories (very liberal Democrat, somewhat liberal Democrat, moderate or conservative Democrat, Independent or non-leaning, liberal or moderate Republican, somewhat conservative Republican, very conservative Republican). Those who did not respond, mentioned no party, or said they were not interested in politics were recoded as Independent/non-leaning.
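The dummy-coding scheme described above can be illustrated with a short pandas sketch. The column names and example records here are hypothetical stand-ins, not the survey's actual codebook:

```python
import pandas as pd

# Hypothetical respondent records (illustrative values only)
df = pd.DataFrame({
    "race": ["White", "Black", "White", "Hispanic"],
    "region": ["South", "Northeast", "Midwest", "West"],
    "party7": [1, 4, 6, 7],  # 1 = very liberal Democrat ... 7 = very conservative Republican
})

# Ethnicity dummy: White vs. all others
df["white"] = (df["race"] == "White").astype(int)

# Region dummies with South as the omitted reference category
region_dummies = pd.get_dummies(df["region"], prefix="region").drop(columns="region_South")

df = pd.concat([df, region_dummies], axis=1)
```

Dropping the `region_South` column makes the South the reference category, so each remaining dummy's regression coefficient is the contrast between that region and the South.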

Analytical Strategy

The analysis proceeded in a step-wise fashion, first examining bivariate relationships and then testing a series of regression models. Moreover, political party identification was coded in multiple ways to test the robustness of the interaction results, including interactions with dummy-coded variables. Hayes's (2013) SPSS PROCESS macro was used to test interactions between political party affiliation and trust in social media sources. The bootstrap analysis was conducted with 10,000 iterations and bias-corrected estimates.
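The moderation tests above were run with Hayes's PROCESS macro in SPSS; an equivalent model (PROCESS Model 1 is simply OLS with a product term) can be sketched in Python. The data here are simulated stand-ins built to contain a small negative interaction, and the variable names are illustrative only:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data (not the study's dataset)
rng = np.random.default_rng(42)
n = 2000
d = pd.DataFrame({
    "twitter_trust": rng.integers(1, 6, n).astype(float),  # 1-5 trust scale
    "party": rng.integers(1, 8, n).astype(float),          # 1-7, higher = more Republican
})
# Outcome with a built-in small negative trust x party interaction
d["knowledge"] = (6 - 0.1 * d["twitter_trust"]
                  - 0.05 * d["twitter_trust"] * d["party"]
                  + rng.normal(0, 1, n))

# Moderation model: knowledge ~ trust + party + trust:party
# ('*' in the formula expands to both main effects plus the interaction)
m = smf.ols("knowledge ~ twitter_trust * party", data=d).fit()
print(m.params["twitter_trust:party"])  # interaction coefficient
```

Probing the interaction (as in Figure 1) then amounts to computing the simple slope of trust at low and high values of the party variable from the fitted coefficients.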


Correlations, listed in Table 2, show a moderate negative association between trust in Facebook, Twitter, and YouTube as a source of accurate information about the coronavirus and scientifically accurate knowledge about the coronavirus. In turn, trust in scientific institutions such as the CDC and NIH as a source of accurate information is positively associated with knowledge. Attention to news about the coronavirus was positively associated with knowledge. Finally, perceived timing of harm to the community was positively associated with knowledge.
Results from the regression analysis show that female (compared to male), older respondents, individuals with higher household income, White (compared to others), and individuals living in Northeast, Midwest, and West (compared to those living in South) were all more likely to score higher on knowledge.
Controlling for demographic variables, trust in Facebook as a source of information about the coronavirus was negatively associated with scientifically accurate knowledge (B = -.28, SE = .03, t = -9.80, p < .001), and so was trust in Twitter (B = -.13, SE = .03, t = -4.36, p < .001). Trust in YouTube (B = .008, SE = .03, t = .28, p = .779) and trust in national newspapers (B = -.01, SE = .03, t = -.42, p = .67) were not significantly associated with knowledge. Trust in traditional scientific institutional sources such as the CDC (B = .25, SE = .05, t = 5.11, p < .001) and NIH (B = .23, SE = .05, t = 4.55, p < .001) was significantly associated with knowledge. Attention to news about the coronavirus (B = .11, SE = .05, t = 2.28, p < .05) and perceived timing of harm in the local community (B = .24, SE = .02, t = 11.59, p < .001)—an indicator of risk perception—were positively associated with knowledge.
The interaction between trust in Facebook and political party identity was not significant. The interaction between trust in Twitter and political party identity was weakly and negatively associated with knowledge (B = -.05, SE = .02, t = -2.95, p < .01). There was no significant interaction between trust in YouTube and political party affiliation. Figure 1 illustrates that for individuals leaning towards the Republican party, increasing trust in Twitter decreased knowledge in a small albeit significant way, whereas it had no effect on individuals leaning towards the Democratic party.
Overall, the results suggest that trust in Facebook and Twitter as sources of accurate information was associated with lower levels of knowledge about the coronavirus, controlling for a variety of factors. Moreover, the results suggest that trust in social media channels affects individuals' knowledge about the coronavirus differently, albeit weakly, depending on their political party affiliation.


About one in four (23%) American adults admitted to sharing misinformation via social media: 14% said they shared a story knowing it was fake, and 16% realized the story was fake only after sharing it (Barthel et al., 2016). This misleading or fake information, shared among trusted sources such as family and friends, creates a chain that likely increases confusion in the public mind. In other words, trust in social media channels is associated with misconceptions about the coronavirus.
This finding is worrying because several social media channels have recently taken measures to reduce the amount of fake news and misinformation. For example, Facebook took a number of steps to combat the spread of misinformation on its channels, including banning accounts that spread fake information, providing links to authoritative sources such as government health websites, and investing in fact-checkers and in the news industry (Jin, 2020). Similar steps have been taken by Twitter, which labels or removes harmful fake news content and directs its users to trusted sources of public information (Roth & Pickles, 2020). YouTube has done the same, particularly highlighting information from the WHO, CDC, and other health agencies. Famously, YouTube, along with Facebook and other companies, removed the viral “Plandemic” conspiracy video, which alleged, among other fake claims, that billionaires aided the spread of the coronavirus to push vaccination. Some of the recommendations by science communication scholars (Smith & Seitz, 2019)—such as labelling, or providing myth-correcting information immediately after misleading articles—had already been put in place by social media organizations, yet appear ineffective. Despite these efforts, the strong negative association between trust in Facebook and Twitter and knowledge indicates that the battle against fake or misleading information is ongoing, and that such information harms public knowledge that is vital to protecting public health.
A unique finding of this study is that at least one social media channel—Twitter—increases the knowledge gap between Republicans and Democrats, albeit to a very limited extent. For Republicans, increased trust in Twitter is associated with less knowledge about the coronavirus than for Democrats; the interaction between political affiliation and Twitter trust was small but significant. Trust in Facebook is negatively associated with knowledge irrespective of the partisan divide. While there is no definitive way to say why, it is possible that echo chambers for Republicans are more pronounced on Twitter than on other social media channels—particularly as President Trump used the platform exclusively. It is possible that Twitter attracts a strong cohort of Republicans who are likely to follow the President and other such elites, who in turn contribute to a rise in public misperceptions. Previous research suggests that Twitter is also more likely than Facebook to attract partisan views (Hitlin & Olmstead, 2018). It is also possible that individuals motivated by information seeking tend to use Twitter and YouTube, which are news- and information-rich. As news about scientific topics is increasingly polarized and politicized (Chinn et al., 2020; Merkley & Stecula, 2018), it is possible that Twitter merely amplifies this polarization, either directly or indirectly, as Republicans take cues from Democratic elites to reject science (Merkley & Stecula, 2018). Given that the interaction detected between social media trust and political affiliation is weak in this study, more research is warranted to test these claims.
Nevertheless, this finding is consistent with a recent field experiment (Bail et al., 2018): Republicans who were incentivized to follow messages tweeted by elected officials and opinion leaders with opposing points of view subsequently expressed substantially more conservative views, indicating a backfire effect. Democrats who followed conservative Twitter posts became slightly more liberal, but the effect was not statistically significant. The authors argued that conservatives hold values that prioritize certainty and tradition, and therefore the backfire effects were particularly strong for them; liberals value change and diversity, so repeated exposure to opposing ideas may not produce such strong effects. In contrast to information from political elites, however, interacting with expert information on social media can help increase scientifically accurate knowledge and beliefs.
Experimental evidence indicates the potential of expert agencies such as the CDC to correct health misinformation on social media without losing credibility, and such correction is more effective for those with higher initial misperceptions (Bode & Vraga, 2015; Smith & Seitz, 2019; Vraga & Bode, 2017). This simple public health intervention can help save lives in times of unprecedented health crisis. These experimental studies, however, do not report on partisan differences. Several studies evaluating how science organizations such as the CDC use Facebook and Twitter indicate that they use social media merely for information dissemination, limiting social media's potential for correcting misinformation online (Dalrymple et al., 2016; Lee et al., 2018; Su et al., 2017).
A worrisome finding is that, compared to Whites, people of color appear to have less knowledge about COVID-19 disease dynamics. Death rates due to COVID-19 among African Americans and indigenous groups are about five times those of non-Hispanic White persons; for Hispanic or Latino persons, the death rate is about four times that of White persons ("COVID-19 in Racial and Ethnic Minority Groups," 2020). It appears that social inequalities, in combination with communication inequalities related to access and knowledge, are likely contributing to the higher death rates among people of color.


A primary limitation is that this is a cross-sectional study, so it cannot determine whether trust in social media produced a lack of knowledge or whether people who already hold strong misconceptions about the coronavirus (particularly those who mistrust traditional media) tend to trust social media. Our analysis relies on a series of Yes/No questions, similar to the True/False questions often used to assess knowledge (Su et al., 2017); a more comprehensive assessment of public information levels is necessary. While our measures of trust were specifically related to the accuracy of information about the coronavirus on social media, more refined measures of specific kinds of trust in social media information would give a more comprehensive picture. For example, trust can be measured along the dimensions of expertise, integrity, and benevolence (Hendriks et al., 2015), even though several studies use just one overall measure of trust. It is unclear why trust in national newspapers was not associated with knowledge; this could reflect a general decline of trust in newspapers or increasing skepticism particularly related to coverage of science news. The intense opinion polarization along political lines is particularly visible in the US and may not apply to other countries. Nevertheless, cultural factors such as individualism and egalitarianism, not considered in this study, have been found to be associated with political partisanship and shown to influence risk perceptions worldwide. Furthermore, the study did not examine the content of the science information encountered online. In fact, it may be impossible to curate each individual's social media feed to make any conclusive arguments about the kinds of information people encounter on social media.
While we were able to detect partisan effects for Republicans who trust Twitter, it is unclear why such an effect was not observed for Facebook or YouTube, or for Democrats, even as other research indicates that social-media-driven polarization is particularly high for Republicans (Bail et al., 2018). Future research should further test the impact of network heterogeneity on knowledge gains through trust in social media channels.


Social media is now the primary way through which the public accesses news about science and health issues, even as traditional science journalism is in decline. As a result, trust in social media as an accurate source of information matters for public knowledge and engagement with complex and emerging scientific issues. The findings of this paper are troubling: trust in social media channels is negatively associated with basic knowledge about the coronavirus, even as social media organizations have mounted major efforts to control the spread of fake news and misinformation. Our ability to contain the "infodemic" will also be key to saving lives during the coronavirus pandemic.

Figure 1.
Interaction between trust in Twitter and political party affiliation in predicting knowledge about COVID-19. The interaction variables were mean centered to reduce multicollinearity.
Table 1.
Demographic Characteristics of this Sample
Variable Weighted (%) Unweighted (%)
Age
18-23 12 10
24-39 27 30
40-55 27 28
56-74 30 28
75 or older 4 4
Gender
Male 49 47
Female 51 53
Race/Ethnicity
White, non-Hispanic 62 65
Black, non-Hispanic 12 13
Hispanic/Latino 17 16
Other or 2+ races, non-Hispanic 8 7
Education
Less than high school 9 3
High school 29 26
Some college, no degree 20 29
Associate's degree 9 12
Bachelor’s degree or higher 21 19
Master's degree 10 8
Professional or doctorate degree 2 2
Political Party
Democrat 44 44
Independent 11 11
Republican 38 37
No party/Not interested 7 7
Region
Northeast 17.4 19
Midwest 21 21
West 24 39
South 38 21

Note. N = 3933

Table 2.
Correlations among Key Variables
Variable 1 2 3 4 5 6 7 8 9
1 Knowledge 1
2 Trust national newspaper 0.01 1
3 Trust Facebook -.29** .19** 1
4 Trust Twitter -.25** .26** .62** 1
5 Trust YouTube -.24** .19** .62** .61** 1
6 Trust CDC .19** .31** .05** .04* .04* 1
7 Trust NIH .17** .36** .09** .09** .06** .63** 1
8 Attention to news about coronavirus .06** .15** .11** .11** .11** .10** .15** 1
9 Harm .27** .07** -.13** -.09** -.14** .13** .11** .13** 1


*p < .05, **p < .001

Table 3.
Regression Analysis Predicting Knowledge of the Coronavirus
B SE β LL UL p
(Constant) 2.2 0.23 1.71 2.60 <.001
Female 0.6 0.06 0.14 0.43 0.66 <.001
Age 0.3 0.03 0.13 0.19 0.31 <.001
Education -0.01 0.03 0.00 -0.07 0.06 0.83
Household income 0.14 0.03 0.07 0.07 0.20 <.001
White 0.47 0.06 0.11 0.34 0.59 <.001
Northeast 0.07 0.03 0.04 0.01 0.13 0.02
Midwest 0.12 0.03 0.06 0.06 0.18 <.001
West 0.11 0.03 0.05 0.04 0.17 0.00
Political party identity -0.07 0.02 -0.07 -0.11 -0.04 <.001
Trust newspapers -0.01 0.03 -0.01 -0.06 0.04 0.676
Trust Facebook -0.28 0.03 -0.19 -0.33 -0.22 <.001
Trust Twitter -0.13 0.03 -0.09 -0.19 -0.07 <.001
Trust YouTube 0.01 0.03 0.01 -0.05 0.07 0.78
Trust CDC 0.25 0.05 0.09 0.15 0.34 <.001
Trust NIH 0.23 0.05 0.09 0.13 0.33 <.001
Attention News 0.11 0.05 0.03 0.02 0.21 0.02
Harm 0.24 0.02 0.17 0.20 0.28 <.001
Trust Facebook*Political party 0.02 0.02 0.03 -0.01 0.05 0.13
Trust Twitter*Political party -0.05 0.02 -0.06 -0.08 -0.02 0.00
Trust YouTube*Political party -0.01 0.02 -0.01 -0.04 0.03 0.63

Note. n = 3825. ΔR2 = .22. CI = confidence interval; LL = lower limit; UL = upper limit. Female was dummy coded with reference to male. Northeast, Midwest, and West were dummy coded with reference to the South region. Prior to testing interactions, variables were mean centered.
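The mean-centering procedure described in the table note can be sketched as follows. This is a minimal illustration on simulated data: the variable names, scales, and coefficients are placeholders for exposition, not the study's dataset, and the model is fit with ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Simulated predictors (placeholders, not the study's data)
trust_twitter = rng.normal(3.0, 1.0, n)      # e.g., a 1-5 trust scale
party = rng.integers(1, 8, n).astype(float)  # e.g., 1 = strong Democrat ... 7 = strong Republican

# Mean-center the predictors before forming the interaction term,
# which reduces multicollinearity between the main effects and the product
trust_c = trust_twitter - trust_twitter.mean()
party_c = party - party.mean()
interaction = trust_c * party_c

# Simulated knowledge outcome with a small negative interaction effect
knowledge = (7.0 - 0.13 * trust_c - 0.07 * party_c
             - 0.05 * interaction + rng.normal(0.0, 1.0, n))

# OLS fit; design matrix columns: intercept, trust, party, trust x party
X = np.column_stack([np.ones(n), trust_c, party_c, interaction])
coefs, *_ = np.linalg.lstsq(X, knowledge, rcond=None)
print(coefs.round(2))
```

Because the predictors are centered, the main-effect coefficients are interpretable as the effect of each predictor at the mean level of the other, and a negative interaction coefficient corresponds to the pattern in Figure 1, where trust in Twitter is associated with lower knowledge more strongly for Republicans.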


Allum, N., Sturgis, P., Tabourazi, D., & Brunton-Smith, I. (2008). Science knowledge and attitudes across cultures: A meta-analysis. Public Understanding of Science,
American Press Institute (2015, August 21). Race, ethnicity, and the use of social media for news.
Anderson, A. A., Yeo, S. K., Brossard, D., Scheufele, D. A., & Xenos, M. A. (2018). Toxic talk: How online incivility can undermine perceptions of media. International Journal of Public Opinion Research, 30(1), 156-168.
Bail, C. A., Argyle, L. P., Brown, T. W., Bumpus, J. P., Chen, H., Hunzaker, M. B. F., Lee, J., Mann, M., Merhout, F., & Volfovsky, A. (2018). Exposure to opposing views on social media can increase political polarization. Proceedings of the National Academy of Sciences, 115(37), 9216-9221.
Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130-1132.
Barthel, M., Mitchell, A., & Holcomb, J. (2016, December 15). Many Americans believe fake news is sowing confusion. Pew Research Center’s Journalism Project.
Baum, M. A. (2002). Sex, lies, and war: How soft news brings foreign policy to the inattentive public. The American Political Science Review, 96(1), 91-109. JSTOR
Bode, L. (2016). Political news in the news feed: Learning politics from social media. Mass Communication and Society, 19(1), 24-48.
Bode, L., & Vraga, E. K. (2015). In related news, that was wrong: The correction of misinformation through related stories functionality in social media. Journal of Communication, 65(4), 619-638.
Bridgman, A., Merkley, E., Loewen, P. J., Owen, T., Ruths, D., Teichmann, L., & Zhilin, O. (2020). The causes and consequences of COVID-19 misperceptions: Understanding the role of news and social media. Harvard Kennedy School Misinformation Review. 1(3),
Brossard, D., & Scheufele, D. A. (2013). Science, new media, and the public. Science, 339(6115), 40-41.
Cacciatore, M. A., Yeo, S. K., Scheufele, D. A., Xenos, M. A., Brossard, D., & Corley, E. A. (2018). Is Facebook making us dumber? Exploring social media use as a predictor of political knowledge. Journalism & Mass Communication Quarterly, 95(2), 404-424.
Chinn, S., Hart, P. S., & Soroka, S. (2020). Politicization and polarization in climate change news content, 1985-2017. Science Communication.
Chung, A. H., & Rimal, R. N. (2015). Revisiting the importance of knowledge: From Namibia, a case for promoting knowledge by campaigns to reduce stigma. Health Education & Behavior, 42(2), 249-256.
Connor, M., & Siegrist, M. (2010). Factors influencing people’s acceptance of gene technology: The role of knowledge, health expectations, naturalness, and social trust. Science Communication, 32(4), 514-538.
Dalrymple, K. E., Young, R., & Tully, M. (2016). “Facts, not fear”: Negotiating uncertainty on social media during the 2014 Ebola crisis. Science Communication.
Davies, S. R., & Hara, N. (2017). Public science in a wired world: How online media are shaping science communication. Science Communication, 39(5), 563-568.
Dixon, S. (2022). Social media use during COVID-19 worldwide-Statistics & facts. Statista.
Drummond, C., & Fischhoff, B. (2017). Individuals with greater science literacy and education have more polarized beliefs on controversial science topics. Proceedings of the National Academy of Sciences, 114(36), 9587-9592.
Dudley, M. Z., Limaye, R. J., Salmon, D. A., Omer, S. B., O’Leary, S. T., Ellingson, M. K., ..., & Chamberlain, A. T. (2021). Racial/ethnic disparities in maternal vaccine knowledge, attitudes, and intentions. Public Health Reports, 136(6), 699-709.
Dunlap, R. E., McCright, A. M., & Yarosh, J. H. (2016). The political divide on climate change: Partisan polarization widens in the US. Environment: Science and Policy for Sustainable Development, 58(5), 4-23.
Dunn, A. G., Surian, D., Leask, J., Dey, A., Mandl, K. D., & Coiera, E. (2017). Mapping information exposure on social media to explain differences in HPV vaccine coverage in the United States. Vaccine, 35(23), 3033-3040.
Earle, T. C. (2010). Trust in risk management: A model‐based review of empirical research. Risk Analysis, 30(4), 541-574.
Feldman, L., Maibach, E. W., Roser-Renouf, C., & Leiserowitz, A. (2012). Climate on cable: The nature and impact of global warming coverage on Fox News, CNN, and MSNBC. The International Journal of Press/Politics, 1940161211425410.
Fiske, S. T., & Taylor, S. E. (1991). Social cognition (2nd ed.). Mcgraw-Hill Book Company
Funk, C., Gottfried, J., & Mitchell, A. (2017, September 20). Most Americans express curiosity in science news, but a minority are active science news consumers. Pew Research Center’s Journalism Project.
Gustafson, A., Rosenthal, S. A., Ballew, M. T., Goldberg, M. H., Bergquist, P., Kotcher, J. E., Maibach, E. W., & Leiserowitz, A. (2019). The development of partisan polarization over the Green New Deal. Nature Climate Change, 9(12), 940-944.
Hargittai, E., Füchslin, T., & Schäfer, M. S. (2018). How do young adults engage with science and research on social media? Some preliminary findings and an agenda for future research. Social Media + Society, 4(3), 2056305118797720.
Hendriks, F., Kienhues, D., & Bromme, R. (2015). Measuring laypeople’s trust in experts in a digital age: The Muenster Epistemic Trustworthiness Inventory (METI). PLOS ONE, 10(10), e0139309.
Hilgard, J., & Jamieson, K. H. (2017). Does a scientific breakthrough increase confidence in science? News of a Zika vaccine and trust in science. Science Communication, 39(4), 548-560.
Hitlin, P., & Olmstead, K. (2018, March 21). The science people see on social media. Pew Research Center Science & Society.
Hmielowski, J. D., Feldman, L., Myers, T. A., Leiserowitz, A., & Maibach, E. (2013). An attack on science? Media use, trust in scientists, and perceptions of global warming. Public Understanding of Science,
Ho, S. S., Yang, X., Thanwarani, A., & Chan, J. M. (2017). Examining public acquisition of science knowledge from social media in Singapore: An extension of the cognitive mediation model. Asian Journal of Communication, 27(2), 193-212.
Huber, B., Barnidge, M., Gil de Zúñiga, H., & Liu, J. (2019). Fostering public trust in science: The role of social media. Public Understanding of Science, 28(7), 759-777.
Jin, K. X. (2020, June 24). Keeping people safe and informed about the Coronavirus. Facebook.
Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), 732-735.
Kim, C., & Lee, J. K. (2016). Social media type matters: Investigating the relationship between motivation and online social network heterogeneity. Journal of Broadcasting & Electronic Media, 60(4), 676-693.
Kirilenko, A. P., Molodtsova, T., & Stepchenkova, S. O. (2015). People as sensors: Mass media and local temperature influence climate change discussion on Twitter. Global Environmental Change, 30, 92-100.
Kotcher, J. E., Myers, T. A., Vraga, E. K., Stenhouse, N., & Maibach, E. W. (2017). Does engagement in advocacy hurt the credibility of scientists? Results from a randomized national survey experiment. Environmental Communication, 11(3), 415-429.
Krugman, H. E., & Hartley, E. L. (1970). Passive learning from Television. Public Opinion Quarterly, 34(2), 184-190.
Lee, N. M., VanDyke, M. S., & Cummins, R. G. (2018). A missed opportunity?: NOAA’s use of social media to communicate climate science. Environmental Communication, 12(2), 274-283.
Lewis, J. D., & Weigert, A. (1985). Trust as a social reality. Social Forces, 63(4), 967-985.
Ley, B. L., & Brewer, P. R. (2018). Social media, networked protest, and the March for Science. Social Media + Society, 4(3), 2056305118793407.
Malka, A., Krosnick, J. A., & Langer, G. (2009). The association of knowledge with concern about global warming: Trusted information sources shape public thinking. Risk Analysis, 29(5), 633-647.
Mantwill, S., Monestel-Umaña, S., & Schulz, P. J. (2015). The relationship between health literacy and health disparities: a systematic review. PloS one, 10(12), e0145455,
Media Insight Project. (2017, March 20). ‘Who shared it?’ How Americans decide what news to trust on social media. American Press Institute.
Merkley, E., & Stecula, D. A. (2018). Party elites or manufactured doubt? The informational context of climate change polarization. Science Communication,
Mueller-Herbst, J. M., Xenos, M. A., Scheufele, D. A., & Brossard, D. (2020). Saw it on Facebook: The role of social media in facilitating science issue awareness. Social Media + Society, 6(2), 2056305120930412.
Myers, T. A., Kotcher, J., Stenhouse, N., Anderson, A. A., Maibach, E., Beall, L., & Leiserowitz, A. (2017). Predictors of trust in the general science and climate science research of US federal agencies. Public Understanding of Science, 26(7), 843-860.
Nisbet, M. C., & Markowitz, E. (2016). Americans’ attitudes about science and technology: The social context for public communication. AAAS Commissioned Review,
Nisbet, M. C., Scheufele, D. A., Shanahan, J., Moy, P., Brossard, D., & Lewenstein, B. V. (2016). Knowledge, reservations, or promise?: A media effects model for public perceptions of science and technology. Communication Research.
Oschatz, C., Maurer, M., & Haßler, J. (2019). Learning from the news about the consequences of climate change: An amendment of the cognitive mediation model. Journal of Science Communication, 18(2), A07.
Poortinga, W., & Pidgeon, N. F. (2003). Exploring the dimensionality of trust in risk regulation. Risk Analysis, 23(5), 961-972.
Rikard, R. V., Thompson, M. S., McKinney, J., & Beauchamp, A. (2016). Examining health literacy disparities in the United States: a third look at the National Assessment of Adult Literacy (NAAL). BMC public health, 16, 1-11.
Roth, Y., & Pickles, N. (2020, May 11). Updating our approach to misleading information. Twitter.
Rousseau, D. M., Sitkin, S. B., Burt, R. S., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23(3), 393-404.
Schäfer, M. S. (2016). Mediated trust in science: Concept, measurement and perspectives for the `science of science communication’. Journal of Science Communication, 15(5), C02.
Schäfer, M. S. (2017). How changing media structures are affecting science news coverage. The Oxford Handbook of the Science of Science Communication, 51-57.
Scheufele, D. A., & Krause, N. M. (2019). Science audiences, misinformation, and fake news. Proceedings of the National Academy of Sciences, 116(16), 7662-7669.
Siegrist, M., Gutscher, H., & Earle, T. (2005). Perception of risk: The influence of general trust, and general confidence. Journal of Risk Research, 8(2), 145-156.
Slovic, P. (1993). Perceived risk, trust, and democracy. Risk Analysis, 13(6), 675-682.
Smith, C. N., & Seitz, H. H. (2019). Correcting misinformation about neuroscience via social media. Science Communication, 41(6), 790-819.
Song, H., McComas, K. A., & Schuler, K. L. (2018). Source effects on psychological reactance to regulatory policies: The Role of Trust and Similarity. Science Communication, 40(5), 591-620.
Stecula, D. A., Kuru, O., & Jamieson, K. H. (2020). How trust in experts and media use affect acceptance of common anti-vaccination claims. Harvard Kennedy School Misinformation Review, 1(1).
Stocking, G., Matsa, K. E., & Khuzam, M. (2020, June 24). As COVID-19 emerged in U.S., Facebook posts about it appeared in a wide range of public pages, groups. Pew Research Center’s Journalism Project.
Su, L. Y.-F., Akin, H., Brossard, D., Scheufele, D. A., & Xenos, M. A. (2015). Science news consumption patterns and their implications for public understanding of science. Journalism & Mass Communication Quarterly.
Su, L. Y.-F., Scheufele, D. A., Bell, L., Brossard, D., & Xenos, M. A. (2017). Information-sharing and community-building: Exploring the use of Twitter in science public relations. Science Communication, 39(5), 569-597.
Tancoigne, E. (2019). Invisible brokers: “Citizen science” on Twitter. Journal of Science Communication, 18(6), A05.
Tewksbury, D., Weaver, A. J., & Maddex, B. D. (2001). Accidentally informed: Incidental news exposure on the World Wide Web. Journalism & Mass Communication Quarterly, 78(3), 533-554.
Vraga, E. K., & Bode, L. (2017). Using expert sources to correct health misinformation in social media. Science Communication, 39(5), 621-645.
Watson, A. (2023, May 11). Social media news consumption frequency in the U.S. 2022, by ethnicity. Statista.
Wang, Y., McKee, M., Torbica, A., & Stuckler, D. (2019). Systematic literature review on the spread of health-related misinformation on social media. Social Science & Medicine, 240, 112552.
Weingart, P., & Guenther, L. (2016). Science communication and the issue of trust. Journal of Science Communication, 15(5), C01.
Zhao, X., Leiserowitz, A. A., Maibach, E. W., & Roser-Renouf, C. (2011). Attention to science/environment news positively predicts and attention to political news negatively predicts global warming risk perceptions and policy support. Journal of Communication, 61(4), 713-731.