White Paper: How Emotional Vulnerability Leads to Health Misinformation and What Leaders Can Do About It
Executive Summary
The battle against health misinformation isn’t just about correcting facts; it’s about addressing emotions. For years, health communication has leaned heavily on factual and rational persuasion. However, emerging research reveals that even when information is accurate, audiences can reject it.
Research suggests that the reason for this lies not in what people know, but in what they feel. While analytical thinking does help resist some falsehoods, it is emotional dysregulation (the difficulty in managing negative emotions) that drives susceptibility to conspiracy theories and misinformation.
This white paper draws on original research exploring the relationship between analytical thinking, emotional dysregulation and conspiracy belief. The findings are clear: emotion, not reason, is the stronger predictor of susceptibility.
For health leaders, this has major implications. Communications strategies that fail to consider emotional states leave audiences exposed to false but comforting narratives. Misinformation flourishes in this gap, not because people are uninformed but because they’re emotionally unanchored.
To lead effectively in this landscape, health brands and institutions must move beyond fact correction and adopt emotionally intelligent communication practices. This paper outlines why this shift is necessary and how to begin.
The New Reality: Facts Alone Don’t Win Minds
Health communication strategies have largely focused on correcting misinformation through data, evidence and rational argument. The assumption has been that if people believe something false, it must be because they don’t have the right information.
However, evidence no longer supports this approach. Conspiracy theories and health misinformation continue to thrive in an age where information has never been more available. Even when the facts are accessible, they are often ignored, rejected or reinterpreted.
This points to a deeper issue, and one that may not be solved by better infographics or clearer messaging. As research now shows, belief in conspiracy theories is not necessarily a failure of reasoning. It is more likely rooted in emotional responses to a sense of disruption, vulnerability or uncertainty. When people are unable to manage their emotions (fear, anxiety or confusion), they are far more likely to latch onto simplified, emotionally laden narratives that are easier to grasp and provide an immediate sense of relief and control.
What makes this particularly challenging for health leaders is that conspiracy beliefs are resistant to correction. Once accepted, they tend to be reinforced by group identity, confirmation bias and mistrust of institutions. Attempts to refute them with more data often have little effect or backfire (Lewandowsky et al., 2012). This is especially true when audiences are already emotionally dysregulated. In that state, the brain is more likely to default to quick, intuitive processing, known as System 1 thinking, instead of the slower, more reflective reasoning of System 2 (Kahneman, 2011; Evans, 2008).
The issue is not that people don’t know enough. It’s that they’re overwhelmed, emotionally dysregulated and neurologically primed to reach for fast, emotionally satisfying explanations, regardless of accuracy.
This reality calls for a new approach to health communications. An approach that understands that emotion, not reason, often decides what people believe.
What Drives Health Beliefs: Emotion Over Reason
Health misinformation has largely been viewed through the lens of cognitive failure. The assumption has been that if people could think more critically or reflect more deeply, they would be less susceptible to false or conspiratorial narratives. This belief has shaped everything from public health campaigns to social media health communication strategies.
And yet, this view only tells half the story.
Analytical thinking does have a protective effect; it can help override gut reactions and identify logical inconsistencies, but its influence appears limited. As the evidence from this study shows, the ability to think critically does correlate with a lower belief in conspiracy theories. But crucially, it's not the strongest predictor.
The stronger predictor is emotional dysregulation: the difficulty in managing, or recovering from, negative emotional states. In emotionally dysregulated individuals, belief in conspiracy theories is significantly more likely to take hold, regardless of analytical ability.
This finding aligns with the wider psychological literature. When people experience distress, their cognitive systems default to what is known as System 1 processing: fast, intuitive, emotionally driven thinking that relies on mental shortcuts (Kahneman, 2011; Evans, 2008). In contrast, System 2 processing is slow, deliberate reasoning. It requires attention, working memory and emotional regulation. When those cognitive resources are depleted in times of stress or overwhelm, System 2 becomes harder to access.
Emotionally dysregulated individuals are also more likely to focus on threatening stimuli, exaggerate perceived risks and struggle to tolerate ambiguity. In this state, conspiracy theories can feel less like irrational beliefs and more like coping strategies. They become narratives that simplify the chaos, offer agency and soothe psychological discomfort. As past studies show, this emotional vulnerability creates a readiness to adopt explanations that provide closure, even if they contradict credible information (Douglas et al., 2017; Van Prooijen & Douglas, 2018; Scandurra et al., 2022).
Put simply, people often believe not because a story is true, but because it helps them feel better. And when the emotional load is high, even intelligent and educated individuals are vulnerable to these beliefs.
This highlights a clear but under-recognised truth in health communication: information quality matters, but emotional state matters more.
Emotional Vulnerability: The Hidden Risk Factor
Emotional vulnerability isn’t always visible. But in the context of health misinformation, it plays a powerful role in shaping what people believe and how resistant they are to change.
In psychological terms, emotional dysregulation refers to the difficulty individuals have in managing their emotional responses. It can manifest as heightened reactivity, poor recovery from stress or persistent negative mood states. Practically, it’s the experience of being emotionally overwhelmed and struggling to step back from fear, anxiety or confusion to assess a situation.
This vulnerability can significantly impair how people process information. When someone is emotionally dysregulated, their attention narrows. They become more attuned to threat, more likely to seek certainty, and more reactive to perceived injustice or loss of control. These emotional conditions make conspiracy theories particularly appealing as they provide the promise of clarity, the reassurance of blame and an illusion of control.
The problem is, this susceptibility often goes unnoticed in health communication. Campaigns are typically designed around the assumption that if information is clear, accurate and well-distributed, it will be accepted. But as this research shows, audiences don’t process information in a vacuum. Their ability to engage with a message depends on their emotional state at the time they receive it.
This has real consequences for health communication. When emotional vulnerability is ignored, even well-intentioned messaging can fall flat or, worse, provoke resistance. People who feel overwhelmed are less likely to absorb complex health guidance and more likely to default to emotionally charged narratives that feel simpler and more psychologically rewarding.
Importantly, emotional vulnerability is not confined to any one demographic. It cuts across age, education and socioeconomic status. As this research suggests, even individuals with strong analytical reasoning skills are at risk if they are emotionally dysregulated. This challenges the idea that susceptibility to misinformation is an issue of intelligence or education. Instead, it reframes it as an issue of emotional capacity.
For health communicators, this insight is critical. It highlights the need to move beyond knowledge-based models and begin considering emotional readiness: that is, how psychologically equipped someone is to engage with complex or uncomfortable truths.
Why Traditional Health Messaging Falls Short
The traditional health messaging strategy follows a simple knowledge-based model: deliver accurate information, debunk falsehoods and trust that clarity will lead to compliance. This approach overlooks how people receive and interpret messages, especially during periods of emotional strain, when they may need the information most. The result is a disconnect.
Emotionally unanchored audiences don’t necessarily reject messages because those messages lack clarity. They fail to connect because the message doesn’t meet them where they are, psychologically or emotionally.
Another limitation is that messaging strategies prioritise what to say over how it will be received. The framing, tone and delivery of a message often carry as much weight as the content itself, especially when people are already overwhelmed. Without this attunement, good evidence-based guidance risks being misread as condescending, or reinforcing the very mistrust it’s trying to dismantle.
To move forward, health communication needs more than factual accuracy. It needs emotional intelligence: the ability to anticipate how a message will land, and to design it in ways that create both clarity and psychological safety.
A Blueprint for Emotionally Intelligent Health Communication
If health communication is to be more effective, it must be designed with emotional readiness in mind. This means moving beyond how a message is delivered to how it is likely to be received, especially under conditions that range from daily life stressors to deeper emotional dysregulation.
Tone matters, but so does context. Different types of messages require different emotional strategies. A public health alert during a pandemic calls for a different approach than a wellness message promoting lifestyle change. The former may need to steady fear without escalating it, while the latter must build trust and motivation without overwhelming the audience.
Effective communication relies on attunement. That is, it relies on understanding the audience’s emotional state and the psychological demands of the message itself. The aim isn’t to soften the facts, but to reduce the resistance so the message can be heard, processed and absorbed.
One valuable strategy is to incorporate cognitive reappraisal: reframing how a message is conveyed so that it is understood in a more constructive light. Instead of emphasising “what’s not working,” the message might shift to “what you can do now.” This change lowers defensiveness and encourages engagement without diluting the core message.
Sequencing also shapes how a message is perceived. Leading with emotional acknowledgement before introducing complex or corrective information can create psychological safety and improve receptivity. These are subtle shifts, but they make information easier to engage with and more likely to be retained.
Emotionally intelligent communication isn’t about persuasion. It’s about creating conditions for connection, so that understanding has a place to take root.
Conclusion: Leading With Emotional Intelligence in Health Communication
Health misinformation is not just a failure of logic. It’s a failure to recognise the emotional conditions under which people make decisions.
This paper has shown that emotional dysregulation, not a lack of reasoning ability, is a stronger predictor of belief in conspiracy theories. When people feel overwhelmed, uncertain or emotionally unanchored, they are more likely to seek out simplified narratives that offer relief, clarity or control, even if those narratives contradict credible information.
For health communicators, this changes the brief. The task is no longer just about correcting falsehoods or clarifying complex information. The mission is to design messages that are emotionally attuned, psychologically grounded and capable of reaching audiences who may be operating under very different emotional states.
This doesn’t mean compromising on accuracy. It means delivering information in ways that reduce resistance and create space for reflection. Emotional intelligence, once seen as a soft skill, is now central to effective communication in a complex and often polarised health landscape.
As this research suggests, emotionally intelligent messaging is not a tactical add-on. It is a foundational requirement. One that health brands, institutions and public bodies must build into their communication strategies if they are to earn trust and change behaviour.
Note: This white paper is adapted from postgraduate research completed in 2023 as part of my MSc in Psychology. It has been reframed for health brand and communication leaders seeking evidence-based strategy in a noisy landscape.
References:
Ahmed, S. P., Bittencourt-Hewitt, A., & Sebastian, C. L. (2015). Neurocognitive bases of emotion regulation development in adolescence. Developmental Cognitive Neuroscience, 15, 11–25. https://doi.org/10.1016/j.dcn.2015.07.006
Aldao, A., Nolen-Hoeksema, S., & Schweizer, S. (2010). Emotion-regulation strategies across psychopathology: A meta-analytic review. Clinical Psychology Review, 30(2), 217–237. https://doi.org/10.1016/j.cpr.2009.11.004
Bardeen, J. R., Daniel, T. L., Hinnant, J. B., & Orcutt, H. K. (2017). Emotion dysregulation and threat-related attention bias variability. Motivation and Emotion, 41(3), 402–409. https://doi.org/10.1007/s11031-017-9604-z
Douglas, K. M., Sutton, R. M., & Cichocka, A. (2017). The psychology of conspiracy theories. Current Directions in Psychological Science, 26(6), 538–542. https://doi.org/10.1177/0963721417718261
Douglas, K. M., & Sutton, R. M. (2018). Why conspiracy theories matter: A social psychological analysis. European Review of Social Psychology, 29(1), 256–298. https://doi.org/10.1080/10463283.2018.1537428
Evans, J. St. B. T. (2008). Dual-processing accounts of reasoning, judgment, and social cognition. Annual Review of Psychology, 59, 255–278. https://doi.org/10.1146/annurev.psych.59.103006.093629
Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131. https://doi.org/10.1177/1529100612451018
Molenda, Z., Green, R., Marchlewska, M., Cichocka, A., & Douglas, K. M. (2023). Emotion dysregulation and belief in conspiracy theories. Personality and Individual Differences, 204, 112042. https://doi.org/10.1016/j.paid.2022.112042
Van Prooijen, J. W., & Douglas, K. M. (2018). Belief in conspiracy theories: Basic principles of an emerging research domain. European Journal of Social Psychology, 48(7), 897–908. https://doi.org/10.1002/ejsp.2530
Victor, S. E., & Klonsky, E. D. (2016). Validation of a brief version of the Difficulties in Emotion Regulation Scale (DERS-18) in five samples. Journal of Psychopathology and Behavioral Assessment, 38(4), 582–589. https://doi.org/10.1007/s10862-016-9547-9