How Severe is the Problem?

“Many people are likely to be misinformed, not only inaccurate in their factual beliefs but confident that they are right.” So say Kuklinski and colleagues (2000, p.809); modest enough, but in the same article, they find that most people are misinformed about welfare, crime, the proportion of minorities (see also Wong et al., 2012), and foreign affairs (see also Kull, Ramsay, & Lewis, 2010). Other scholars have observed severe misinformation on issues ranging from the economy (Bartels, 2002) to national debt and tax policy (Flynn et al., 2017). And recent elections in the U.S. have been characterized as containing “an alarming lack of honesty in political communication” (Hameleers & Van der Meer, 2020). Even outside of elections, the American political system currently “abounds” with misinformation (Bode & Vraga, 2015, p.621).

It is not merely the absolute amount of misinformation, however, that constitutes the problem; it is also the perception of misinformation. The ubiquitous discussion of “fake news” fits in here. In the U.S., more than two-thirds of people report being concerned about their ability to separate what is real and fake when it comes to news (Reuters, 2019). Clearly, the majority of Americans believe not only that misinformation exists, but that it is an incredibly widespread and severe problem. Unfortunately, the public’s wariness about misinformation may only encourage them to retreat into their ideological bunkers and brand all opposing news and information as “fake news” or misinformation. This trend is exacerbated by Americans’ general susceptibility to conspiracy theories, another form of misinformation. I will touch more on this later, but conspiracy theory belief is incredibly widespread: over half of the electorate believes in at least one conspiracy theory (Oliver & Wood, 2014). And this unfortunate trend of citizens confidently believing false information or conspiracy theories seems only to be increasing, based on recent events. The phenomenon of QAnon would be one example. Another, more serious example is the fact that one third of Americans believe that President Biden only won the election due to fraud (CNN, 2021). This last finding underscores the problem that misinformation has become in America. It is hard to overstate how normatively troubling it is that one in three Americans believe the government has no legitimacy, and should therefore have no authority to make laws or expect their compliance.

It is important to note that by misinformation I mean explicitly false information, not simply the different interpretations of fact that arise from partisan reasoning, where Republicans and Democrats see the same events from differing perspectives (as in Bartels, 2002 or Gaines et al., 2007). Nor is misinformation simply ignorance. I follow the example of Kuklinski et al. (2000), who differentiate being uninformed from being misinformed; being misinformed means confidently holding beliefs that are wrong (p.793) but are presented as factually correct (see also Thorson, 2016). I will not stress degree of confidence in this essay, but Kuklinski and colleagues do find that those most confident in their views are, ironically, the most misinformed.

Consequences of Misinformation

There are many deleterious consequences of misinformation in contemporary American politics. I will briefly discuss the following consequences: polarization, diminishing trust and cohesion, anger, apathy, diminishing deliberation, worse policy outcomes, reduced citizen competence, greater intolerance and support for undemocratic action, and greater network homogeneity.

Polarization

Polarization is not only a consequence of misinformation but also a cause (see Del Vicario et al., 2016). Through motivated reasoning, the highly partisan are the most ‘motivated’ to defend their existing beliefs, as those beliefs are both more important to their identity and have been acquired at greater cost of effort (see Taber & Lodge, 2006 or Kunda, 1990). That makes highly polarized environments (whether in terms of ideology or partisanship) especially vulnerable to misinformation. As the stakes of defending one’s beliefs go up, concern over potential misinformation (if it comes from a friendly source) goes down, or at least there is less incentive to thoroughly fact-check friendly information.

Polarization is also a consequence, however. Because misinformation is more likely to be created and perpetuated by the highly motivated (whether in terms of partisanship or ideology), it tends to represent an extreme viewpoint. As these more marginal or radical viewpoints proliferate, they can drown out the more moderate voices in the public sphere (Au, Ho, & Chiu, 2021). Since fake news is often antagonistic (think of accusations that the Democratic party is running a child-sex ring), it can also contribute to polarization by making the opposition seem more criminal or evil.

I often think of my time as an intern in a Congressional office in Washington, D.C. as an example of polarization. One of my roles was to answer constituent phone calls, and these were usually impassioned pleas for the Congressman to do something, anything, to stop President Obama’s agenda, since his goal was to “literally destroy the country.” It was clear that these callers weren’t using that phrase merely for rhetorical effect. In my long conversations with them (very one-sided conversations, to be fair) I learnt how President Obama’s radical Islamic agenda was formulated to get him into office with the sole intention of destroying the constitution piece by piece, and eventually setting up an authoritarian state. How did these citizens arrive at these deluded views? Clearly from steady consumption of political misinformation. As a consequence, these extreme views, gained from information that was factually wrong (i.e. that President Obama is Muslim), led to a widening chasm between their own ideological self-placement and where they believed President Obama stood ideologically. That is polarization. It makes sense, though; if you truly believe that someone’s goal is to destroy your country, compromise with that person is impossible, and the political gulf between you will seem enormous.

Adoption of misinformation by one side will also encourage hostility from the other side. A good example is the anti-vaxxer movement, which grew out of an erroneous, retracted scientific study and has fueled a massive and heated online debate, increasing polarization on an issue that previously had none (Schmidt et al., 2018). Ridiculing those who hold this misinformed belief about vaccinations only serves to harden their attitudes. As MacKuen et al. (2010) describe, when people feel psychologically threatened or ridiculed, they become angry and are more likely to engage in defensive processing. In this way, the extreme rhetoric and angry recriminations that result from the false claims of misinformation only exacerbate polarization and make compromise more difficult.

Greater Anger

A misinformed belief or conspiracy theory can lead to greater anger when those beliefs falsely implicate a political leader, party, or segment of society. The process works as follows: at the heart of much misinformation and many conspiracy theories is a desire to reduce the complexity of life and provide order to what seems like an incoherent, unconnected thread of highly complicated or random events (Miller et al., 2016). As Miller and colleagues go on to explain, as part of this simplification, the conspiracy theory usually reveals that a powerful person or group is moving behind the scenes to secure these outcomes (p.825). This simultaneously provides order, reduces anxiety, and provides someone to blame for political events. A good example of this process (although not related to a conspiracy theory) is sociotropic voting. In trying to reduce the enormous complexity of a nation’s economy into a single metric, citizens often overestimate the role of the President in controlling economic conditions. When economic conditions are bad, this leads to blame and anger (Conover & Feldman, 1986, p.72). In much the same way, the hidden sources of power that citizens falsely believe in (the deep state, the Illuminati, the Jews, etc.) generate anger when those misinformed beliefs indicate a convenient scapegoat.

Greater Apathy

As indicated earlier by the 2019 Reuters report on the media environment, a majority of Americans are worried about their ability to detect misinformation in their news environment. This inability to tell genuine information from misinformation can lead to apathy. Specifically, scholars use the term “reality apathy” to describe a process in which an overload of false information invokes a feeling of disorientation, indifference, and unwillingness to engage with the political world (Wehsener, n.d.). This is the mental equivalent of throwing up one’s hands in despair–what’s the point when one cannot tell good information from bad? The apathy that results from a high-misinformation environment (or perceived high-misinformation environment) obviously has the potential to bleed into civic participation.

Lack of Social Trust and Cohesion

The increased polarization, anger, and apathy will (together with misinformation) likely have a negative impact on social cohesion and generalized trust in American society. This is arguably most critical in terms of trust in public institutions and government. For example, misinformation about the government’s vaccine rollout has led to erroneous beliefs such as the assertion that the Covid vaccine inserts a tracking device, or even a time-released death chemical, into one’s body (Loomba et al., 2021). In countries where such beliefs occur, they have led to an erosion of trust in the government among those who hold them, especially in terms of healthcare (Lovari, 2020). Of course, that is a specific example of a general trend. Anyone who holds misinformed and hostile beliefs about either the government or other groups in society is likely to experience a reduction in generalized trust or sense of social cohesion.

Greater Intolerance

Tolerance is often defined as extending the full rights of citizenship to groups one dislikes (Sullivan et al., 1982). However, there is debate as to whether tolerance should extend to groups seen as actively harming others or democracy (e.g. Gibson, 1988). Misinformation aggravates that debate because it often impugns the motives of the “other side”. If I erroneously believe Democrats are actively involved in human sex trafficking, tolerating such a group requires a much greater commitment.

Worse Policy Outcomes

Misinformation is often about policies, and this erroneous information can lead to worse policies (or at least policy outcomes that are more divorced from reality). For example, the widespread misinformation citizens have absorbed about welfare has led to policy preferences premised on a severe overestimation of welfare’s scope (Kuklinski et al., 2000, p.806). As Flynn et al. put it, misinformation has “distorted people’s views about some of the most consequential issues in politics” (2017, p.127). Misinformation about a policy could also lead citizens to reject it even when it would be beneficial. The Covid vaccine is a salient recent example.

Worse Deliberation, Representation, and Citizen Competence

These categories are grouped together because they are so interconnected, and misinformation has a caustic effect on them all. Jeff Mondak often remarks that the first rule of citizenship is that citizens must possess at least some knowledge about the political system. They must, therefore, be able to tell good information from bad. This echoes the normative burden that Converse (1964) and many subsequent scholars have placed on citizens–that they should be (at least to some degree) informed. A belief in factually incorrect information, therefore, indicates that citizens are less able to meet the competence requirement for participating meaningfully in a democracy.

Although other scholars have lessened the knowledge expectations of citizens, that does not necessarily exonerate citizens in their tendency to believe misinformation. When Fiorina (1981, p.5) says that “citizens are not fools”, and merely need to “see or feel” the result of policies rather than understand them, that sentiment looks less rosy given the specter of misinformation. How can I accurately see or feel the result of any given policy when I have been given completely misleading information about that policy? I am not the passive experiencer of policy that Fiorina describes; instead, I am an anxious and/or angry policy experiencer because I believe an extreme (and false) claim about how disastrous said policy will be.

As part of this other strain of scholarship, many researchers have turned to heuristics (e.g. Carmines and Kuklinski 1990; Lupia 1994; Mondak 1993; Popkin 1991; Sniderman, Brody, and Tetlock 1991) as a way to solve the problem of a lack of citizen knowledge. And although subsequent research has shown that heuristics aren’t necessarily the silver bullet they were originally predicted to be, especially for the least knowledgeable citizens (Lau & Redlawsk, 2001), they are likely to perform even worse with the rising tide of misinformation. I’ll use the political heuristic of partisan identity as an example. If I’m uninformed about politics, so the theory goes, I can still make rational decisions by relying on the informational shortcut that is my membership in a political party. If I’m a Republican, however, and the Republican candidate is pushing a misinformation campaign that obfuscates the truth behind a policy that would help me, then relying on that partisan cue will lower my payoff.

In a similar fashion, those theories that proposed that citizens could learn all they need from relying on third parties and institutions (most persuasively set out in Lupia and McCubbins’ The Democratic Dilemma) also perform less well with excessive deliberate misinformation. If citizens are unable to verify the truth, impose penalties for lying, or rely on a benevolent third party to separate truth from error, I fail to see how they will learn the information they need.

In terms of representation, some scholarship already suggests that voters are unable to punish extreme candidates because they don’t know enough about politics (Bawn et al., 2012, p.577). Adoption of misinformation is unlikely to help in this regard; in fact, it may lead citizens to reward candidates who adopt the same extreme misinformation as they do. Although this may be good news for scholars who believe that representation requires as tight a congruence as possible between constituents’ attitudes and representatives’ (e.g. Miller & Stokes, 1963), it is less attractive for those who hold a more Burkean view of representation; that is, that elected officials should represent their constituents’ interests rather than their will. From this conception of representation, a candidate who believes the same misinformation as their constituents (or perhaps adopts the same beliefs in a more Machiavellian sense) is not representing their constituents’ interests well, because they are aiding in the perpetuation of misinformation. In a similar vein, how can voters hold elected officials accountable for their actions when they are so misinformed about what those actions are? We can also view this from the legislator’s perspective. Legislators (and other elites) may be more reluctant to engage with citizens whom they perceive to be misinformed, reducing both responsiveness and constituent exposure to correct information (Flynn et al., 2017, p.143).

I fear, therefore, that too high a proportion of misinformation essentially makes the more normatively pleasing deliberative or republican democracy impossible to achieve. Apart from the concerns I have already raised, there are two additional and compelling reasons to believe this. First, “facts” are the “foundation for deliberation about larger issues. They prevent debates from becoming disconnected from the material conditions they attempt to address” (Delli Carpini & Keeter, 1996, p.11). Citizens already have a hard time seeing eye to eye on the same facts (e.g. Gaines et al., 2007); this problem is exacerbated when the facts themselves (not the interpretations) are in question. There can be no meaningful deliberation without such consensus. Instead, we have two groups of people talking past each other with their own sets of “facts.” This has been referred to as an era of “post-factual relativism” (Van Aelst et al., 2017). Biden stole the election; no, he didn’t. Vaccines cause autism; no, they don’t. Barack Obama is Muslim; no, he’s not. And so the “debate” goes on.

A second blow to any hope of deliberation is the increasing homogeneity in discussion groups that misinformation is likely to induce. I have already written about the increase in polarization, and that goes hand in hand with a greater desire to avoid cross-cutting views. Such cross-cutting views can introduce dissonant information, which may be uncomfortable. We know citizens avoid these cross-cutting viewpoints at the best of times (Mutz, 2002), but this discomfort is magnified by the gulf between the two sides that polarization encourages. Citizens in such situations will therefore naturally prefer more homogeneous groups in which to discuss politics, and will avoid heterogeneous discussion groups. That, of course, is fundamentally at odds with the norms of deliberation that are often equated with democracy. There can be no real deliberation if we are only comfortable deliberating within our own groups, or if we discount contrary information simply because of its contrary sourcing (i.e. labeling it “fake news”).

To conclude this section, I refer back to a remarkably prescient statement: McClosky (1964, p.377) remarked that his greatest fear for democracy moving forward was that Americans would be so ignorant about the institutions of democracy that they would end up undermining them in the mistaken belief that they were safeguarding them. The storming of the Capitol by a group of apparently patriotic citizens who wanted to “defend the constitution” is an example of misinformation (“stop the steal”) leading to that very undermining.

Who Do We Blame for Misinformation?

It is difficult to assign blame to only one player in the political system, as the problem of misinformation really arises from a highly interconnected system, and there are multiple avenues of blame. Elites who generate (or perpetuate) false information in order to gain electoral advantage bear some responsibility, but so does a general public that is either too ignorant, apathetic, or partisan to be able to discriminate between truth and error. And, of course, the media, which is supposed to aid the general public in that quest but is failing miserably, also bears some responsibility. So, I argue that the most accurate depiction of responsibility is that elites bear the original sin, that of commission, whereas the public and the media commit sins of omission in failing to hold elites accountable for the misinformation. And hovering over this entire discussion is the role of social media and a recent President who exploited that medium to push misinformation to an unprecedented degree.

Elites and Elected Officials

In many conceptions of democracy, citizens are supposed to be able to rely on elites and other experts to provide the knowledge and cues they need (see Gilens, 2012). The rationale behind this view is that the political world is far too complicated for the average citizen, but that need not be problematic as long as the general public is able to judge the “bearing of knowledge” supplied by elites and experts (see Chapter 6 of Dewey, 1927). This anticipates Zaller’s (1992) argument that citizens can use the conflict between competing elites to make decisions. Much of this competition can come in the form of campaigns, which can increase political knowledge among citizens (Alvarez, 1997). And although citizens do have the potential to be deceived sometimes by disingenuous elites, Lupia and McCubbins find that, on the whole, citizens are quite selective about whom they choose to believe (1998, p.65).

These publications all share a similar trait–they are all too optimistic given the unprecedented (at least in the modern era) amount of misinformation being generated (and believed). I would counter these more hopeful findings as follows. First, there is no longer a “fair” competition between elites in the liberal pluralist sense of the term, in which citizens can weigh the merits of the arguments elites make in competition with one another. There are no longer common facts to argue over (or at least fewer of them), as discussed in the previous section. And campaigns are no longer promoting the kind of learning that we would expect. The “debates” between candidates are farcical, the rhetoric used in the media acrimonious to an unprecedented degree, and the policy discussion non-existent. Campaigns have become bitter conflicts full of “alternative facts” and outright fabrications.

In this way, the competition of elites more starkly resembles Anthony Downs’s (1957) description of political parties as coalitions of elites seeking to win power at any cost. This now includes elites being willing to create a “post-factual” environment in which they can, and do, say anything to appeal to voters, regardless of its accuracy. The problem, of course, is that we have an extensive literature in political science showing that the citizenry often follows elite opinion more or less on a metaphorical string. For example, Zaller (1992) persuasively shows that when politicians and elites agree on an issue, citizens think as one; when elites polarize, citizens (even ideological ones) do as well. Elites are also well-versed in using the media, and tend to have a disproportionate effect on, or even control of, the media (Hallin, 1994; Bennett, 1990). This is significant because the issues the media talks about tend to follow elite discourse (the so-called PMP model: politics–media–politics, as detailed by Wolfsfeld, 2014). Furthermore, the issues the media talk about are also the issues citizens come to feel are most important (Iyengar, Peters, & Kinder, 1982). Between their framing of issues and their power over the media, it is therefore unsurprising that Walter Lippmann (1922) concluded that the general public mostly has only the power to say yes or no to an issue presented to them by elites. The responsibility elites bear to present genuine information is therefore paramount considering their pivotal position in the political system, and their failure to do so sets off the first domino in this cascade of misinformation.

Elected officials hold a special place in this discussion of elites, as they have more direct incentives to stay in office, and they can exploit human psychology through misinformation as a means of achieving that goal. Because humans have a built-in preference to avoid losses (Tversky & Kahneman, 1981), politicians who frame arguments in a way that emphasizes political losses appear more persuasive (Arceneaux, 2012). This is especially true of politicians who can do so while also inducing anxiety in their listeners (Arceneaux, 2012). This gives elected officials an incentive to engage in fear mongering. Apart from being normatively troubling on its own, it is especially concerning given that this fear mongering can now come from totally fabricated statements. The “Red Scare” hysteria, although excessive, was in some part justified because the Soviet Union represented a real existential threat to the United States, given its nuclear arsenal. The fear mongering by President Trump, on the other hand, was usually based on much more tenuous evidence. For example, he tried to create conditions of fear of Islamic extremism on the obviously false premise that “I watched in Jersey City thousands of thousands of people cheering…as that building went down [the World Trade Center]” (Washington Post, 2015).

This is a good segue to discuss the role of the President in preventing (or perpetuating) misinformation. Given that the power of the Presidency is greater than ever, the incentives are certainly there to provide misinformation. And although previously the President’s words were mainly filtered through the media, President Trump has shown that that is no longer strictly necessary given the potential of social media sites like X (formerly Twitter). President Trump was not alone in that strategy; Wolfsfeld and Tsifroni note that politicians all over the world are taking advantage of their new opportunities to reach constituents without the filter of traditional media (2018, pp.226-227). These communications can be powerful inasmuch as Presidents are very persuasive to their co-partisan citizens. In testing Republican loyalty to President Trump specifically, Barber and Pope (2019) found that Republicans were more likely to follow President Trump’s lead even on liberal issues. This was true both of low-knowledge respondents and strong Republicans. Clearly, loyalty to the President can trump ideological fervour (no pun intended). That obviously creates opportunities for Presidents to make false claims and yet still pull their co-partisans along, as we saw most disturbingly with the election claims in 2020.

The Mass Public

Although I have already laid most of the blame on elites and their tendency to provide or perpetuate misinformation in order to create fearful conditions that promote support, the general public certainly bears some blame for failing to punish elites for spreading misinformation. In this section, I will discuss why the general public falls for the misinformation fed to it. This occurs for a variety of reasons, of which I will briefly address the following: homogeneous networks, the online processing model, and ignorance combined with reliance on groups.

Homogeneous Networks

As discussed earlier, cross-cutting networks are essential for republican democracy, but they are also relatively rare (Mutz, 2001). Kathy Cramer tells us that when people talk about politics, it is usually with those with whom they already have firm social ties, such as family members and coworkers (2004, p.22). These contacts are typically like-minded in contemporary America. This means that citizens are less exposed to dissonant viewpoints, viewpoints which are essential in that they might challenge the misinformation citizens receive from their co-partisans. Mutz (2001) theorized that television news might provide those dissonant views, but we know that citizens, given greater media choices, are now selecting like-minded media more frequently than ever (Prior, 2007). This means not only that challenges to misinformation from co-partisan sources are rarer, but that there is potentially a double-dipping effect if citizens are also exposed to the same misinformation from sympathetic media sources parroting it. A Republican, for example, could read about supposed election fraud on X from President Trump (or hear it from a friend or family member who shares her political views), and then have that viewpoint solidified by hearing it on Tucker Carlson’s show or seeing it in her Facebook feed. It is a self-reinforcing cycle that makes misinformation more difficult to reject, as it comes through multiple avenues and from positions of trust.

Online Model

Having absorbed those false informational claims about the election, our hypothetical Republican then proceeds to forget all about them. Well, almost. Political information is not recorded in its entirety, but is updated “online” with affective tags (Lodge et al., 1989). Then, when called upon to make a decision (such as evaluating whether to vote for Trump in 2024), a “judgment counter” is retrieved and used to render an evaluation (Lodge et al., 1989, p.416). In this way, our hypothetical Republican can know whether they like or dislike Trump as a candidate in 2024, and have legitimate reasons for that evaluation, while not remembering the specific items of information that formed the basis for that evaluation. In her case, Trump is evaluated positively because her online tally of positive tags is overwhelming. All well and good, but the presence of misleading information could mean that citizens are just affectively coding bad information positively. If all the positive tally marks are just instances of belief in fraudulent claims made by President Trump, the overall evaluation will be made based on bad information. That was less problematic when citizens lived in less homogeneous social network structures, as at least they could be exposed to negative tally marks about candidates they liked from their social acquaintances.

Ignorance and Reliance on Groups

Although not always connected, in this case ignorance and group-reliance are joined. I use ignorance not as a pejorative term, but as a description of the state of the overwhelming majority of Americans when it comes to political affairs. I would throw myself in that group too. Our modern political reality is far too subjective and complicated for citizens to make truly rational decisions (Lippmann, 1922). The average person is “caught in the sweep of forces too vast to understand or master” (Dewey, 1927, Chapter 4). There is just too much to know. As Lau and Redlawsk put it, “the widespread ignorance of the general public about all but the most highly salient political events and actors is one of the best documented facts in all of the social sciences” (2001, p.951). But, as discussed earlier, the ability of citizens to muddle through using various cognitive shortcuts (called heuristics) that allow them to act despite their ignorance (e.g. Graber, 1984) is questionable at best, since (given that ignorance is the rule) low-information citizens tend to use their cognitive shortcuts rather poorly (Lau & Redlawsk, 2001). They latch on to heuristics like party identification without knowing enough about the issues and can end up voting “incorrectly”: voting against their preferences, or opposite to how they would have voted had all the information been available to them. This widespread ignorance is part of why citizens are not able to punish elites for presenting misinformation–they are not knowledgeable enough (or motivated enough) to tell the difference. The political world is much too complicated, parsing through lies too effortful, and the entertainment options too tempting to pass up. Why spend time fact-checking your local representative (presuming you know who that is) when the Great British Baking Show is one click away on your Roku remote?

But citizens are forced to make political decisions anyway. And so, in order to tame this overwhelmingly vast tide of political information, citizens invariably end up making political decisions based upon their social and group identities, including partisanship (Achen & Bartels, 2016, p.4). And it turns out that citizens are remarkably adept at accurately inferring the policy positions of many politically relevant groups in society (e.g. political parties, racial groups, and so forth) (Brady & Sniderman, 1985). Brady and Sniderman (1985) theorize that they are so accurate because of the use of something called the likability heuristic. That is, citizens assess how they feel about an issue, and how they feel about a group, and use that information to estimate the group’s position (p.1075). So, if I’m asked about the ACLU’s position on immigration, and my own view of immigration is that there should be less of it, and I dislike the ACLU, then I can infer that the ACLU supports immigration. More troublingly, this logic also works with groups that actively spread misinformation. If I like QAnon, I can just as easily base my policy positions on the conspiracy theories it propagates. It is certainly less cognitively taxing.

In relying on these groups, including partisan groups, a clear double standard is established (see Graham & Svolik, 2020, p.393): Republicans are willing to punish only Democratic candidates and vice versa, a tendency that strengthens with partisanship. Furthermore, Graham and Svolik find that voters care little about executive aggrandizement and undemocratic behaviour. Although they did not test it specifically, this would presumably include the undemocratic behaviour of creating or perpetuating misinformation.

Thus, as Riker (1982) sadly concluded: “Popular rule is impossible but…citizens can exercise an intermittent, sometimes random, even perverse, popular veto on the machinations of political elites” (as quoted in Kuklinski & Peyton, 2007). Perhaps it is unfair, therefore, to expect citizens to police elite misinformation-sharing.

The Media

Of course, in an ideal world, the media would be lowering information costs so that the general public could more easily detect misinformation. This fits right in with Habermas’s description of what the media ought to be doing in serving as a watchdog for the welfare of citizens, providing dialogue across a diverse range of views, holding office holders accountable, and helping the public better understand their political environment (1996, p.378). That the media is failing to live up to those lofty standards would be unsurprising to most Americans–the Reuters report already cited finds that fewer than a third say they trust the news overall (2019). That they do fail is mostly a result of structural issues the industry is facing, some of them new, others identified as early as the 1920s by Walter Lippmann.

What’s new is the unprecedented fragmentation of the media as a whole, and the economic hardships they are facing. Since 2004, 1,800 newspapers in the U.S. have closed or merged, and about 1,300 U.S. towns and cities have lost local print news coverage completely (Newman, Fletcher, Kalogeropoulos, and Nielsen, 2019). And in the last ten years there has been a 25% reduction in the total number of journalists in the United States (Pew Research Center, 2019). Fewer journalists mean less coverage ‘on the ground’. The falling number of journalists and their redistribution to urban areas has also been accompanied by a decline in investigative journalism. The primary reason is economic. Given the financial constraints media companies are facing, investigative journalism is too expensive an undertaking. In its place, more media companies are opting for ‘soft news’, with more human interest stories, ‘horse race politics’, ‘click bait’ headlines, celebrity journalists, and ‘talking heads’ (Hamilton, 2010). This desperation for views incentivizes struggling media outlets to discuss misinformation (even when they are not generating it themselves) because it provokes a reaction, and hence eyeballs and monetized clicks.

What isn’t new is the mental and structural biases journalists face in reporting stories (as described by Lippmann, 1922). In fact, Lippmann argues that we should not expect the press to be able to inform citizens in a reasonable way because it is a capitalist institution beholden to advertisers and readership for survival. This dependence on readership is especially problematic in terms of misinformation. Returning to our hypothetical Republican, Fox News may be unwilling to harshly repudiate any false claims by a Republican politician for fear of alienating viewers who agree with the misinformation.

Then there are the logistical concerns with the press–too few reporters with too little expertise being relied on to give an accurate picture of what’s going on in the world. Additionally, real reporting costs too much, and thus the media relies on news being provided by other actors (press releases would be one example). Reporters can also only pick from a myriad of possible stories, and thus have an incentive to invest news stories with an emotional appeal to make them interesting (Lippmann, 1922). What’s more emotional than the fear-mongering false claims politicians can make? For these reasons, the news media is now poorly positioned to hold elites accountable for misinformation.

Social Media

While the traditional media may not be very good at policing misinformation, social media actively promotes it. The main reason is that citizens themselves are the sources of political information in this environment, spreading it to and receiving it from their social networks. Unlike information from the standard news media, information gained from one’s peers undergoes no fact-checking. Information spread through one’s social network can be biased and “wildly inaccurate” (Carlson, 2019, p.326). This is especially true on social media, where cascades of shares through one’s trusted peer network give even the most implausible stories a quick coating of credibility. This facilitates the spreading and legitimizing of conspiracy theories, which are not solely the province of political novices (Miller et al., 2016). And, as mentioned earlier, over half of the electorate holds at least one conspiracy theory (Oliver & Wood, 2014), and thus the ideologues or “ideal informants” that Converse (1964) and others have invoked to save democracy likely include many of the same people sharing these false stories online. In fact, Carlson finds that these ideal informants are often quite badly mislabeled. People routinely overestimate the expertise of their social ties, and when they do have opposing ideological ties, they avoid them to minimize “psychological discomfort” (p.338). Thus, between the like-minded “experts” in their own social networks, and avoidance of the experts who are not like-minded, the only impediment citizens face in spreading misinformation is finger fatigue from hitting the share button so many times.

How do we fight the misinformation problem?

Given that I’ve just laid out the causes and consequences of our society’s misinformation problems, one would think the solutions would be clear and easy. They are clear to an extent, because we know what factors (or at least many of the factors) mitigate misinformation, but implementing them is certainly anything but easy. The solutions I suggest are greater cross-cutting discourse, more citizen learning, and more regulation of information.

Cross-cutting Discourse and Citizen Learning

I have already discussed, several times, the importance of cross-cutting discourse as a potential dampener on the spreading and holding of misinformation. Discussion and exposure to contrary views in a respectful environment promotes tolerance of diversity and a regard for the claims and interests of others (Mutz, 2002, p.112). Frank and open dialogue with someone who does not hold the same misinformed views as oneself can promote more deliberative thinking about the veracity of one’s own information (Bakshy et al., 2015). The problem with this approach, of course, is that the onus is on the individual citizen. They have to want to broaden their political horizons and take the uncomfortable step of engaging in conversation with those with whom they disagree politically. To aid in that somewhat difficult task, I believe government (especially at the local level) should invest in opportunities for more “local town hall” meeting events. This is the fundamental problem John Dewey identified with our modern democracy: it has left behind the local discursive centers, like the town hall meetings, on which our society and its laws were founded (1927). He believed that in order for democracy to work, a person needs to know and understand their neighbours, including those with heterogeneous views. That, obviously, has become more difficult in today’s society than in the 19th century.

Besides contact, the other avenue is to try to improve citizen learning so that citizens gain the political sophistication necessary to differentiate between misinformation and good information. Delli Carpini and Keeter (1996, p.272) suggest the following list of ways to improve citizen learning:

  1. Media that increase the quality of, and access to, information
  2. Civics classes that continue later in life
  3. Campaign reform that improves the quality of information
  4. Citizens joining community associations
  5. Targeting currently disengaged citizens

Of course, these suggestions are rather broad (“fix the media”), but they do provide a roadmap of sorts. I will make just a few comments on what I see as the most promising of the suggestions. Their suggestion of continuing civics classes is an intriguing (if unlikely) one. Imagine if attending a civics class were like jury duty–just another obligation of citizenship. That would provide opportunities for the political sophisticates (who don’t need the classes) to rub shoulders with the novices, and perhaps facilitate learning to a degree in a more low-stakes environment.

Campaign reform is another broad suggestion that I think has a lot of promise. One of the more pressing reforms, I believe, is improving the debate format used in campaigns. There are three key changes I would implement to improve the experience for listeners. First, there needs to be a mute button on each candidate’s microphone so that candidates cannot interrupt each other and thereby prevent citizens from hearing each of them fairly. This was implemented in the final 2020 presidential debate, and should continue to be used. Second, journalists must press the candidates to answer the questions as asked rather than allowing them to evade the questions or go off on a tangent. That requires only persistence on the part of the journalist, and an expectation, established before candidates agree to debate, that they will answer. And third, real-time (or near real-time) fact-checking from an independent source should be possible given technological advances. I would like a running tally (similar to the moving “ticker” ESPN displays at the bottom of its programs) showing citizens both the proportion of misleading answers and the specific instances of lying. Although citizens can choose to disbelieve fact-checkers, this at least creates an environment in which candidates know their answers are being checked for misinformation in near real-time and shown to the citizenry.

More Regulation of Media?

The “fix the media” approach is undoubtedly the most popular call to solve the misinformation problem, but it is the most difficult and presents the most trade-offs. It is true, however, that much of the misinformation that flourishes does so because of the fragmented, laissez-faire media in the United States. Anyone can view whatever information corresponds to their political preferences, and there’s no real penalty for spreading misinformation. Even well-established media outlets like Fox News have spread misinformation about climate change, electoral coverage, and more, and that is without mentioning the host of fringe media companies that are allowed to operate in our very open marketplace.

Contrast this with the constrained media environment of the 1950s to 1980s, where political consensus was created by the fact that all citizens were essentially viewing the same media (on four or five different channels). In this more constrained environment, the penalties for spreading misinformation were much more severe. And although we cannot return to that environment completely, it would be theoretically possible to raise the standards of what is required to operate a media outlet. For example, one solution might be to allow only reporters with a journalism degree to write stories or appear on air, since they are at least generally indoctrinated to adhere to the so-called objectivity norm (see Tuchman, 1972) and to maintain a certain standard of quality. Another solution might be to require media companies to adhere to a fact-checking process, and to display the results of that fact-checking along with their stories. Sure, consumers could ignore that, but they might at least pause if a story is flagged as misinformation.

Of course, it is highly possible (probable?) that the cure for misinformation in this case is worse than the disease. It feels very un-American, although one should remember that Federalist No. 63 maintains that it may occasionally be necessary to protect the people “against their own temporary errors and delusions.” Various experts have struggled with how best to provide this protection from “errors and delusions.” Walter Lippmann’s solution was a team of experts attached to each department in the government to report on its activities (1922). He recognized “the need for interposing some form of expertness between the private citizen and the vast environment in which he is entangled” (p.238), and he was skeptical of the ability of the press to do this. As he put it, “public opinion must be organized for the press…not by the press” (p.19).

John Dewey, too, reserved a role for experts. He wanted experts for “discovering and making known the facts” for the public rather than deciding on policy for it (1927, p.208). The question is: how would these proposals help in our current era of misinformation? I like Lippmann’s solution, although the result would be a series of dry reports from various bureaucratic agencies that would never be read by the general public. There is a reason the media do not report on this already–it is too uninteresting to the average citizen. Nobody will ever care what the Office of Management and Budget is doing unless there is some unlikely budgeting scandal. But at least it would provide a baseline of factual information that inquiring citizens could use if they wanted.

To conclude this section, there are no easy answers to systematically combating misinformation that do not also require a higher level of regulation than most Americans are (perhaps justifiably) comfortable with. The solutions proposed here try to operate in that middle ground, but are probably too feeble to stem the raging flow of misinformation. I fear, however, that trying too vigorously to stem the flow would cut off the water supply completely!

The role of Political Science in Combatting Misinformation

Political science has mostly played a descriptive role in the fight against misinformation up to this point. There are numerous studies, some cited in this paper, detailing the problems and causes of misinformation. There are fewer studies, however, taking the more normative angle of how misinformation ought to be handled. I mentioned Dewey’s and Lippmann’s more idealistic suggestions (a return to smaller communities, and attached informational bodies). Part of the reason for this dearth is that misinformation has only been a truly serious issue (at least in the modern era), in my opinion, since the Obama presidency, which coincided with the soaring availability of smartphones and more ubiquitous social media. The time horizon has been so short that I am not surprised political scientists are playing catch-up. There has, however, been a recent spate of articles in the wake of the popular discussion of fake news during the Trump presidency (e.g. Brown, 2018; Ecker & Ang, 2019; Allcott, Gentzkow, & Yu, 2019). In a review of the scholarly literature, Jerit and Zhao (2020) also find that recommendations for correcting the misinformation problem have been scattered: no “accumulation of science”, building finding on finding, has developed. Instead, each researcher (or at least each who offers normative suggestions rather than just observation or description) offers a rather haphazard list of suggested solutions, none of which may be particularly feasible. If that sounds familiar, it is because it is precisely the strategy employed in this essay! The most basic and common of these solutions has been to offer “correction” in some form, although much of the literature has focused on the potential boomerang effects that can accompany correction (see for example Bullock, 2007; Nyhan & Reifler, 2010; Nyhan & Reifler, 2015).

Jerit and Zhao also note a couple of other commonalities in the literature. The first is that most commentators at least include a section about the danger misinformation poses to deliberative or republican democracy. I would argue that there has, in fact, been an over-emphasis on republican democracy and a neglect of other models. The connection between misinformation and republican democracy is clearer, given the corrosive effect misinformation has on the very notion of deliberation, or “the best argument wins”, but I have long doubted whether a republican version of democracy accurately describes our system at all. The elite model and the liberal-pluralist model (as described by Baker (2002)) seem to fit contemporary American democracy better. How big a problem misinformation poses to those conceptions of democracy is an open question mostly ignored in the literature. The liberal-pluralist model–which does not assume that the public interest arises from deliberation, but rather that it remains latent until expressed by the banding together of large interest groups–seems like it would fare slightly better with respect to misinformation, since it shifts the focus from deliberation onto groups. Of course, misinformation about policies affecting one’s group would still be possible.

Likewise, an elite model seems more feasible as well, though not foolproof. In this conception of democracy, society is ruled by elites, and the public simply has to pick which elite (or group of elites) will represent its interests. Again, deliberation is not strictly necessary in this model; interests are given to the public by elites. Of course, this opens the door for elites to provide misinformation that ensures the public chooses their “party of elites” over the other, but I do think the absence of a deliberation element helps mitigate the problem of misinformation. Neither of these alternative models is perfect in terms of solving misinformation, but both have been under-explored in the literature.

The second observation of Jerit and Zhao (2020, p.87) is that there is often only a fine line between ignorance, misinformation, and simple partisan cheerleading. On this last point, the methods employed in most papers on misperception involve survey questions asking about factual information. There is a possibility, however, that survey respondents know their answers are technically wrong but want their “side” to look good to the researcher. Their answers would then reflect partisan cheerleading more than misinformation. I am less concerned about this, however, because citizens express plenty of beliefs that align with conspiracy theories, and these are harder to fit into the category of partisan cheerleading. It is less reasonable to assume that a respondent is partisan cheerleading by responding on an open-ended survey that Jews control the world economy, or that planes are leaving dangerous chemical trails, or any other conspiracy theory.

What should be the role of Political Science moving forward in regard to misinformation?

Given that I believe that misinformation (and the polarization and hatred it causes) is one of the greatest threats to our democracy, I believe political scientists need to be more active in detailing solutions to this problem. In 2014, near the beginning of this proliferation of extreme misinformation, Jane Mansbridge addressed the American Political Science Association. In her address, she posed the question: what is political science for? Her answer was that the purpose of political science ought to be to learn to govern ourselves better, and that part of that learning is to focus on how governments can create legitimate coercion in order to solve many of the problems besetting our society. I couldn’t agree more.

Governing ourselves better implies doing more than realizing that misinformation is poisoning our democracy; it is offering real and feasible solutions. The legitimate coercion point Mansbridge raises is especially insightful. She notes that our political system is founded on, and preoccupied with, a prevention of tyranny (p.9). This has developed into a “resistance tradition” in political science rooted in contract theory and which primarily worries about government overreach in curtailing freedoms. While that is a noble tradition, it was developed in an environment in which personal honour and reputation for politicians and elites was paramount, in which media options were more limited, and in which citizens were not expected to play any meaningful role. That does not describe our situation today. Given what we know about motivated reasoning, about misinformation, and about citizens’ general apathy about democracy, it is crucial that we do something to counter the growing acceptance of misinformation.

Kuklinski et al. (2000) have a good suggestion here–hit citizens “right between the eyes.” “Unless they are ‘hit between the eyes’ with the right facts, they continue to judge policy on the basis of their mistaken beliefs” (p.810). Unfortunately, a follow-up survey showed that people often returned to their original beliefs, but at least temporary gains were made, and it is possible that more persuasive forms of correction can be found in the future. As part of that process, I would like to see more outreach from political scientists to the general public. The vast majority of political science work sits unread in electronic journal databases that are inaccessible to the general public, and that rarely get read even by political scientists themselves! This topic, however, is too important for that kind of limited outreach. Political scientists need to be publishing their findings and recommendations about misinformation in more popular media forms. They need to be appearing on public television programs. They need to be hosting podcasts, giving TED talks, and holding public forums. There are many ways of reaching the general public, but political scientists are not making the effort to do so. Perhaps the public will ultimately be uninterested, but it should not be for lack of trying on the part of political scientists.

Bibliography

Allcott, H., Gentzkow, M., & Yu, C. (2019). Trends in the diffusion of misinformation on social media. Research & Politics, 6(2). https://doi.org/10.1177/2053168019848554

Washington Post. (2015). Analysis: Trump’s outrageous claim that ‘thousands’ of New Jersey Muslims celebrated the 9/11 attacks. Retrieved August 18, 2021, from https://www.washingtonpost.com/news/fact-checker/wp/2015/11/22/donald-trumps-outrageous-claim-that-thousands-of-new-jersey-muslims-celebrated-the-911-attacks/

Arceneaux, K. (2012). Cognitive Biases and the Strength of Political Arguments. American Journal of Political Science, 56(2), 271–285.

Au, C. H., Ho, K. K. W., & Chiu, D. K. W. (2021). The Role of Online Misinformation and Fake News in Ideological Polarization: Barriers, Catalysts, and Implications. Information Systems Frontiers. https://doi.org/10.1007/s10796-021-10133-9

Baker, E. (2002). Media, Markets, and Democracy. Cambridge University Press. Pp. 125–153.

Bawn, K., Cohen, M., Karol, D., Masket, S., Noel, H., & Zaller, J. (2012). A Theory of Political Parties: Groups, Policy Demands and Nominations in American Politics. Perspectives on Politics, 10(3), 571–597. https://doi.org/10.1017/S1537592712001624

Bartels, L. M. (2002). Beyond the Running Tally: Partisan Bias in Political Perceptions. Political Behavior, 24(2), 117–150.

Benier, K., & Wickes, R. (2016). The effect of ethnic diversity on collective efficacy in Australia. Journal of Sociology, 52(4), 856–873. https://doi.org/10.1177/1440783315599595

Bennett, W. L. (1990). Toward a Theory of Press-State Relations in the United States. Journal of Communication, 40(2), 103–127. https://doi.org/10.1111/j.1460-2466.1990.tb02265.x

Blumler, J., & Gurevitch, M. (1995). Politicians and the Press: An Essay on Role Relationships. In The Crisis of Public Communication. Routledge.

Bode, L, & Vraga, E.K. (2015) In Related News, That was Wrong: The Correction of Misinformation Through Related Stories Functionality in Social Media, Journal of Communication, 65(4), 619–638.

Brady, H. E., & Sniderman, P. M. (1985). Attitude Attribution: A Group Basis for Political Reasoning. The American Political Science Review, 79(4), 1061–1078. https://doi.org/10.2307/1956248

Brown, E. (2018). Propaganda, Misinformation, and the Epistemic Value of Democracy. Critical Review, 30(3–4), 194–218. https://doi.org/10.1080/08913811.2018.1575007

Bullock JG. 2007. Experiments on partisanship and public opinion: party cues, false beliefs, and Bayesian updating. PhD Thesis, Stanford University, Stanford, CA

Carlson, T. N. (2019). Through the Grapevine: Informational Consequences of Interpersonal Political Communication. American Political Science Review, 113(2), 325–339. https://doi.org/10.1017/S000305541900008X

Delli Carpini, M. X., & Keeter, S. (1996). What Americans Know About Politics and Why It Matters. Yale University Press.

Chong, D., & Druckman, J. N. (2007). A Theory of Framing and Opinion Formation in Competitive Elite Environments. Journal of Communication, 57(1), 99–118. https://doi.org/10.1111/j.1460-2466.2006.00331.x

Conover, P. J., & Feldman, S. (1986). Emotional Reactions to the Economy: I’m Mad as Hell and I’m not Going to Take it Anymore. American Journal of Political Science, 30(1), 50–78. https://doi.org/10.2307/2111294

Coronel, J. C., Duff, M. C., Warren, D. E., Federmeier, K. D., Gonsalves, B. D., Tranel, D., & Cohen, N. J. (2012). Remembering and Voting: Theory and Evidence from Amnesic Patients. American Journal of Political Science, 56(4), 837–848.

Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., Stanley, H. E., & Quattrociocchi, W. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences of the United States of America, 113(3), 554–559. https://doi.org/10.1073/pnas.1517441113

Dewey, J. (1927). The Public and Its Problems. Holt.

Douds, K., & Wu, J. (2018). Trust in the Bayou City: Do Racial Segregation and Discrimination Matter for Generalized Trust? Sociology of Race and Ethnicity, 4(4), 567–584. https://doi.org/10.1177/2332649217717741

Ecker, U. K. H., & Ang, L. C. (2019). Political Attitudes and the Processing of Misinformation Corrections. Political Psychology, 40(2), 241–260. https://doi.org/10.1111/pops.12494

CNN. (2021). Analysis: 1 in 3 Americans believe the “Big Lie.” Retrieved August 18, 2021, from https://www.cnn.com/2021/06/21/politics/biden-voter-fraud-big-lie-monmouth-poll/index.html

Flynn, D. J., Nyhan, B., & Reifler, J. (2017). The Nature and Origins of Misperceptions: Understanding False and Unsupported Beliefs About Politics: Nature and Origins of Misperceptions. Political Psychology, 38, 127–150. https://doi.org/10.1111/pops.12394.

Freedman, P., Franz, M., & Goldstein, K. (2004). Campaign Advertising and Democratic Citizenship. American Journal of Political Science, 48(4), 723–741. https://doi.org/10.2307/1519930

Gaines, B. J., Kuklinski, J. H., Quirk, P. J., Peyton, B., & Verkuilen, J. (2007). Same Facts, Different Interpretations: Partisan Motivation and Opinion on Iraq. The Journal of Politics, 69(4), 957–974. https://doi.org/10.1111/j.1468-2508.2007.00601.x

Gibson, J. L. (1988). Political Intolerance and Political Repression During the McCarthy Red Scare. American Political Science Review, 82(2), 511–529.

Gilens, M. (2012). Affluence and Influence: Economic Inequality and Political Power in America. Princeton University Press. https://doi.org/10.1515/9781400844821

Glas, I., Jennissen, R., & Engbersen, G. (n.d.). Estimating Diversity Effects in the Neighborhood: On the Role of Ethnic Diversity and Out-group Size and their Associations with Neighborhood Cohesion and Fear of Crime. Social Indicators Research. https://doi.org/10.1007/s11205-021-02704-9

Graber, D. A. (1984). Processing the News. Longman Professional Studies in Political Communication and Policy.

Graham, M. H., & Svolik, M. W. (2020). Democracy in America? Partisanship, Polarization, and the Robustness of Support for Democracy in the United States. American Political Science Review, 114(2), 392–409. https://doi.org/10.1017/S0003055420000052

Gundelach, B., & Manatschal, A. (2017). Ethnic Diversity, Social Trust and the Moderating Role of Subnational Integration Policy. Political Studies, 65(2), 413–431.

Habermas, J. (1996). Between Facts and Norms. MIT Press.

Hallin, D. C. (2004). Comparing Media Systems: Three Models of Media and Politics. Cambridge University Press.

Hamilton, J. T. (2010). The (Many) Markets for International News. Journalism Studies, 11(5), 650–666.

Hameleers, M., & van der Meer, T. G. L. A. (2020). Misinformation and Polarization in a High-Choice Media Environment: How Effective Are Political Fact-Checkers? Communication Research, 47(2), 227–250. https://doi.org/10.1177/0093650218819671

Hindman, M. S. (2019). The Internet Trap: How the Digital Economy Builds Monopolies and Undermines Democracy. Princeton University Press.

Dinesen, P. T., Schaeffer, M., & Sonderskov, K. M. (2020). Ethnic Diversity and Social Trust: A Narrative and Meta-Analytical Review. In M. Levi & N. L. Rosenblum (Eds.), Annual Review of Political Science, Vol. 23 (pp. 441–465). Annual Reviews.

Hovland, C. I., Lumsdaine, A. A., & Sheffield, F. D. (1949). Experiments on Mass Communication [Chapters 1 and 3].

Iyengar, S., Peters, M. D., & Kinder, D. R. (1982). Experimental Demonstrations of the “Not-So-Minimal” Consequences of Television News Programs.

Jerit, J., & Zhao, Y. (2020). Political Misinformation. Annual Review of Political Science, 23(1), 77-94.

Kalla, J. L., & Broockman, D. E. (2018). The Minimal Persuasive Effects of Campaign Contact in General Elections: Evidence from 49 Field Experiments. American Political Science Review, 112(1), 148–166. https://doi.org/10.1017/S0003055417000363

Klapper, J.T. (1960). The Effects of Mass Communication. The Free Press. New York.

Kongshoj, K. (2019). Trusting diversity: Nationalist and multiculturalist orientations affect generalised trust through ethnic in-group and out-group trust. Nations and Nationalism, 25(3), 822–846. https://doi.org/10.1111/nana.12505

Kuklinski, J. H., Quirk, P. J., Jerit, J., Schwieder, D., & Rich, R. F. (2000). Misinformation and the Currency of Democratic Citizenship. The Journal of Politics, 62(3), 790–816.

Kuklinski, J. H., & Peyton, B. (2007). Belief Systems and Political Decision Making. Oxford University Press. https://doi.org/10.1093/oxfordhb/9780199270125.003.0003

Kull, S., Ramsay, C., & Lewis, E. (2003). Misperceptions, the media, and the Iraq War. Political Science Quarterly, 118(4), 569–598.

Kunda, Z. (1990). The Case for Motivated Reasoning. Psychological Bulletin, 108(3), 480–498.

Lau, R. R., & Redlawsk, D. P. (2001). Advantages and Disadvantages of Cognitive Heuristics in Political Decision Making. American Journal of Political Science, 45(4), 951–971. https://doi.org/10.2307/2669334

Lippmann, W. (1922). Public Opinion. Harcourt, Brace & Co.

Lodge, M., McGraw, K. M., & Stroh, P. (1989). An Impression-Driven Model of Candidate Evaluation. The American Political Science Review, 83(2), 399–419. https://doi.org/10.2307/1962397

Loomba, S., de Figueiredo, A., Piatek, S.J. et al. (2021) Measuring the impact of COVID-19 vaccine misinformation on vaccination intent in the UK and USA. Nat Hum Behav 5, 337–348. https://doi.org/10.1038/s41562-021-01056-1.

Lorenz, T. (2021, August 1). To Fight Vaccine Lies, Authorities Recruit an ‘Influencer Army.’ The New York Times. https://www.nytimes.com/2021/08/01/technology/vaccine-lies-influencer-army.html

Lovari, A. (2020). Spreading (Dis)Trust: Covid-19 Misinformation and Government Intervention in Italy. Media and Communication, 8(2), 458–461. https://doi.org/10.17645/mac.v8i2.3219

Loxbo, K. (2018). Ethnic diversity, out-group contacts and social trust in a high-trust society. Acta Sociologica, 61(2), 182–201. https://doi.org/10.1177/0001699317721615

Lupia, A., & McCubbins, M. D. (1998). The Democratic Dilemma: Can Citizens Learn What They Need to Know? Cambridge University Press.

MacKuen, M., Wolak, J., Keele, L., & Marcus, G. E. (2010). Civic Engagements: Resolute Partisanship or Reflective Deliberation. American Journal of Political Science, 54(2), 440–458.

Mansbridge, J. (2014). Presidential Address: What Is Political Science For? Perspectives on Politics, 12(1), 8–17.

Meir, D., & Fletcher, T. (2019). The transformative potential of using participatory community sport initiatives to promote social cohesion in divided community contexts. International Review for the Sociology of Sport, 54(2), 218–238.

Mendelberg, T. (1997). Executing Hortons: Racial Crime in the 1988 Presidential Campaign. Public Opinion Quarterly, 61(1), 134–157.

Miller, J. M., Saunders, K. L., & Farhart, C. E. (2016). Conspiracy Endorsement as Motivated Reasoning: The Moderating Roles of Political Knowledge and Trust. American Journal of Political Science, 60(4), 824–844.

Miller, W. E., & Stokes, D. E. (1963). Constituency Influence in Congress. The American Political Science Review, 57(1), 45–56. https://doi.org/10.2307/1952717

Mintchev, N., & Moore, H. L. (2018). Super-diversity and the prosperous society. European Journal of Social Theory, 21(1), 117–134.

Mutz, D. C. (2002). Cross-Cutting Social Networks: Testing Democratic Theory in Practice. The American Political Science Review, 96(1), 111–126.

Nyhan, B. (2010). Why the “Death Panel” Myth Wouldn’t Die: Misinformation in the Health Care Reform Debate. The Forum, 8(1), Article 5.

Nyhan, B., & Reifler, J. (2015b). Does Correcting Myths About the Flu Vaccine Work? An Experimental Evaluation of the Effects of Corrective Information. Vaccine, 33(3), 459–464.

Oliver, J. E., & Wood, T. J. (2014). Conspiracy Theories and the Paranoid Style(s) of Mass Opinion. American Journal of Political Science, 58(4), 952–966. https://doi.org/10.1111/ajps.12084

Prior, M. (2007). Post-Broadcast Democracy: How Media Choice Increases Inequality in Political Involvement and Polarizes Elections. New York: Cambridge University Press.

Putnam, R. D. (2007). E Pluribus Unum: Diversity and Community in the Twenty-First Century. The 2006 Johan Skytte Prize Lecture. Scandinavian Political Studies, 30(2), 137–174. https://doi.org/10.1111/j.1467-9477.2007.00176.x

Reuters Institute for the Study of Journalism. (2019). Reuters Institute Digital News Report 2019. University of Oxford.

Rudolph, T., & Popp, E. (2010). Race, Environment, and Interracial Trust. The Journal of Politics, 72(1), 74–89.

Schmidt, A. L., Zollo, F., Scala, A., Betsch, C., & Quattrociocchi, W. (2018). Polarization of the vaccination debate on Facebook. Vaccine, 36(25), 3606–3612. https://doi.org/10.1016/j.vaccine.2018.05.040

Stroud, N. J. (2008). Media Use and Political Predispositions: Revisiting the Concept of Selective Exposure. Political Behavior, 30(3), 341–366. https://doi.org/10.1007/s11109-007-9050-9

Taber, C. S., & Lodge, M. (2006). Motivated Skepticism in the Evaluation of Political Beliefs. American Journal of Political Science, 50(3), 755–769.

Thompson-Hernández, W., & Reefer, K. (2020, June 9). Evoking History, Black Cowboys Take to the Streets. The New York Times. https://www.nytimes.com/2020/06/09/us/black-cowboys-protests-compton.html

Thorson, E. (2016). Belief Echoes: The Persistent Effects of Corrected Misinformation. Political Communication, 33(3), 460–480. https://doi.org/10.1080/10584609.2015.1102187

Tuchman, G. (1972). Objectivity as Strategic Ritual: An Examination of Newsmen’s Notions of Objectivity. American Journal of Sociology, 77(4), 660–679.

Tversky, A., & Kahneman, D. (1981). The Framing of Decisions and the Psychology of Choice. Science, 211(4481), 453–458.

Valentino, N. A., Hutchings, V. L., & White, I. K. (2002). Cues That Matter: How Political Ads Prime Racial Attitudes during Campaigns. The American Political Science Review, 96(1), 75–90.

Van Aelst, P., Strömbäck, J., Aalberg, T., Esser, F., de Vreese, C., Matthes, J., Hopmann, D., Salgado, S., Hubé, N., Stępińska, A., Papathanassopoulos, S., Berganza, R., Legnante, G., Reinemann, C., Sheafer, T., & Stanyer, J. (2017). Political communication in a high-choice media environment: A challenge for democracy? Annals of the International Communication Association, 41(1), 3–27. https://doi.org/10.1080/23808985.2017.1288551

Van Staveren, I., & Pervaiz, Z. (2017). Is It Ethnic Fractionalization or Social Exclusion, Which Affects Social Cohesion? Social Indicators Research, 130(2), 711–731.

Watters, S. M., Ward, C., & Stuart, J. (2020). Does Normative Multiculturalism Foster or Threaten Social Cohesion? International Journal of Intercultural Relations, 75, 82–94.

Wehsener, A. (n.d.). Digital Threats to Democracy: Comfortably Numb. Center for a New American Security. Retrieved August 18, 2021, from https://www.cnas.org/publications/commentary/digital-threats-to-democracy-comfortably-numb

Wolfsfeld, G. (2004). Media and the Path to Peace. Cambridge University Press.

Wolfsfeld, G., & Tsifroni, L. (2018). Political Leaders, Media and Violent Conflict in the Digital Age. In R. Frölich (Ed.), Media in War and Armed Conflict: The Dynamics of Conflict News Production and Dissemination (pp. 218–242). New York: Routledge.

Wong, C., Bowers, J., Williams, T., & Simmons, K. D. (2012). Bringing the Person Back In: Boundaries, Perceptions, and the Measurement of Racial Context. The Journal of Politics, 74(4), 1153–1170. https://doi.org/10.1017/S0022381612000552

Zaller, J. (1992). The Nature and Origins of Mass Opinion (Cambridge Studies in Public Opinion and Political Psychology). Cambridge: Cambridge University Press. https://doi.org/10.1017/CBO9780511818691