What is self-deceit?
Among philosophers and psychologists, self-deception is generally understood as the motivated maintenance of a belief that is contradicted by the evidence, often driven by an emotional need such as protecting self-esteem (Deweese-Boyd, 2006). In effect, individuals "favor welcome over unwelcome information in a manner that reflects their goals or motivations" (von Hippel and Trivers, 2011). A self-deceiver may avoid or distort critical information much as a deceiver of others would, through biased information search, interpretation, or memory. Is the process of deceiving oneself, then, an intentional act, or does it operate beyond conscious awareness? Answering this question requires examining the mechanisms of self-deceit across cognitive, emotional, and evolutionary domains. While the question "What is self-deceit?" invites a general definition, this essay focuses on the underlying mechanism of self-deceit, arguing that it is primarily an unconscious process shaped by motivational and cognitive forces.
Psychodynamic Factors
The psychodynamic perspective, rooted in Freudian theory, provides one of the earliest and most enduring accounts of self-deception. It conceptualizes self-deception not as a rational misjudgment or a motivated bias, but as a deep, involuntary defense mechanism employed by the unconscious mind to protect the individual from emotional conflict, anxiety, or psychic pain. Unlike cognitive or behavioral models that emphasize conscious information processing, the psychodynamic view centers on repressed desires, unconscious motives, and internal conflicts between the structures of the psyche. Psychoanalytic theory treats self-deception as an ego defense mechanism deployed unconsciously to defuse conflict between unacceptable desires and internalized moral standards (the superego). In the structural model introduced in Sigmund Freud's The Ego and the Id (1923), the superego is the moral component of personality that internalizes societal and parental standards of right and wrong, guiding behavior and contributing to feelings of guilt or pride. When faced with threatening truths about themselves, individuals may deny or rationalize those truths to alleviate fear and anxiety. For example, a person who believes "good people don't have affairs" but cheats on a spouse might rationalize the behavior by blaming the spouse's neglect, thus resolving the conflict between self-image and action. This rationalization illustrates the broader family of defense mechanisms (repression, denial, rationalization, and projection) that are fundamental to this view of self-deception (A. Freud, 1936). Repression pushes unacceptable thoughts or impulses out of conscious awareness, while denial refuses to accept reality outright. Rationalization allows individuals to construct false but comforting explanations for their actions, and projection attributes one's own undesirable impulses to others. These mechanisms preserve a sense of psychological safety and continuity, often without the person being consciously aware of their operation.
Modern psychodynamic theorists have expanded on these ideas. George Vaillant (1992) categorized defense mechanisms by maturity level and noted that neurotic and immature defenses often involve significant distortion of reality. Self-deception is seen as one of the more complex forms of defense, reflecting the mind's attempt to balance psychological stability against painful self-awareness. To resolve this tension, individuals often engage in unconscious mental restructuring, altering beliefs or rationalizing actions in order to restore internal consistency. This restructuring is not a neutral or rational process; it is driven by a powerful motivation to maintain self-coherence and avoid emotional pain. Similarly, Kunda's (1990) model of motivated reasoning posits that individuals are more likely to arrive at conclusions they want to be true, not because they deliberately choose to deceive themselves, but because motivation shapes how information is processed. This includes selective memory retrieval, biased evaluation of evidence, and the strategic use of justifications that align with self-serving goals.
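The biased-evaluation component of this account can be made concrete with a small illustration. The following Python sketch is a toy model only, not Kunda's formal theory; the function name, weights, and simulated evidence are all illustrative assumptions. It shows how merely over-weighting congruent evidence pulls a judgment toward the desired conclusion even when the evidence itself is balanced.

```python
import random

def evaluate(evidence, congruent_weight=1.0):
    """Toy model of evidence evaluation (illustrative only, not Kunda's model).
    Each item is a number in [-1, 1]; positive values support the flattering,
    desired conclusion. A congruent_weight > 1 over-weights supportive items,
    mimicking biased evaluation of evidence."""
    weighted_sum = 0.0
    total_weight = 0.0
    for item in evidence:
        weight = congruent_weight if item > 0 else 1.0
        weighted_sum += weight * item
        total_weight += weight
    return weighted_sum / total_weight

random.seed(42)
evidence = [random.uniform(-1, 1) for _ in range(10_000)]  # roughly balanced evidence

print(f"even-handed evaluation: {evaluate(evidence):+.2f}")       # close to 0
print(f"motivated evaluation:   {evaluate(evidence, 3.0):+.2f}")  # pulled toward the desired belief
```

Nothing in the sketch fabricates evidence; the distortion lives entirely in how the same evidence is weighted, which is one way to see why the bias can remain invisible to the person exercising it.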
In sum, both classic and contemporary models suggest that the unconscious nature of self-deception is not a weakness but a necessary feature. It allows individuals to manage emotions, protect their identity, and maintain inner harmony. The very effectiveness of self-deception depends on the fact that the self does not recognize the deception at all.
Cognitive Factors
One of the most influential explanations for self-deceit is cognitive dissonance theory (Festinger, 1957). In psychology, cognitive dissonance is a mental phenomenon in which people unknowingly hold fundamentally conflicting cognitions. The theory holds that we have an "inner drive to hold all our attitudes and behavior in harmony to avoid disharmony" (McLeod, 2023). Individuals experience psychological discomfort, termed dissonance, when they hold two or more inconsistent cognitions. This discomfort acts as a motivational force, compelling one to reduce the inconsistency through various strategies, including altering beliefs, justifying behaviors, or minimizing the importance of the conflict. In many cases, this process gives rise to unconscious self-deceptive thinking: individuals adjust or distort their beliefs to avoid the anxiety of internal contradiction, without awareness that they are doing so.
Self-deception, in this context, is a motivated resolution of dissonance that preserves a coherent and positive self-image. When people's actions clash with their moral standards or self-concept, self-deception allows them to maintain a sense of integrity without directly confronting their failings. Take, for example, a person who views themselves as honest yet steals something from a grocery store. This dissonance between self-perception and behavior is resolved through cognitive maneuvers such as trivializing the act, externalizing responsibility, or redefining honesty in a self-serving way:
"It wasn't a big deal… The store overcharges everyone anyway… I'm honest in important matters."
In each case, the inconsistency is not eliminated through behavioral change or genuine self-confrontation but through distorted reasoning that defends the original self-concept.
Numerous studies have supported the link between dissonance reduction and biased cognition. A classic example is Festinger and Carlsmith's (1959) induced-compliance experiment, in which participants performed a boring task and were then asked to tell another participant that it was enjoyable. Those paid only $1 to lie experienced more dissonance than those paid $20, because the smaller reward provided insufficient external justification for the lie. As a result, participants in the $1 condition convinced themselves that the task really had been enjoyable, demonstrating how belief change can serve to reduce internal conflict. The experiment illustrates the psychological mechanism underlying self-deception: when external justifications are weak, people alter internal beliefs to bring them into line with behavior, even if doing so conflicts with objective evidence.
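The logic behind this result can be stated a little more formally. What follows is a common textbook-style rendering of Festinger's idea, offered as an illustrative reconstruction rather than a formula quoted from the sources cited here: the magnitude of dissonance is the proportion of importance-weighted dissonant cognitions,

$$ \text{dissonance magnitude} \;=\; \frac{\sum_{i \in D} w_i}{\sum_{i \in D} w_i + \sum_{j \in C} w_j}, $$

where $D$ is the set of cognitions dissonant with the behavior, $C$ is the set of consonant cognitions, and $w$ is the subjective importance of each. Read this way, the $20 payment supplies a heavily weighted consonant cognition ("I was well paid to say it"), driving the ratio down, whereas the $1 payment leaves the ratio high, so revising the belief about the task becomes the most available route to reduction.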
Modern dissonance research extends this model to self-concept maintenance,
emphasizing the role of self-integrity and moral identity. According to self-affirmation theory (Steele, 1988), individuals are motivated to see themselves as morally adequate and competent. When this self-image is threatened, say by acting hypocritically or receiving critical feedback, dissonance arises. In such cases, people often deceive themselves to preserve self-worth. This may involve denying wrongdoing, attributing fault to others, or reinterpreting events to make them seem more consistent with personal values. Aronson (1992) further developed the idea of self-concept dissonance, arguing that the more central a belief is to one's identity, the more intense the dissonance will be when that belief is challenged. Consequently, self-deception is most likely to occur when the stakes involve core beliefs about the self (e.g., morality, intelligence, competence).
These mechanisms are not merely theoretical. Empirical findings from social psychology show how routinely people distort memory and reasoning to alleviate dissonance. For example, Stone et al. (1994) found that students who publicly promoted safe sex and were then reminded of their own past failures to practice it (creating dissonance with their identity as responsible individuals) were more likely to change their attitudes or behavior when no external justification was available. Others chose to affirm unrelated aspects of the self to reduce the discomfort indirectly, a process that may also fuel selective self-deception. In both cases, the drive to resolve internal tension shaped cognitive outcomes in a direction favorable to the self.
Because cognitive psychology studies the mental processes underlying thought and judgment, it is closely tied to neuroscience, and neuroscientific research sheds light on the neural underpinnings of self-deceit. For example, functional MRI (fMRI) studies show that when individuals experience dissonance, such as during decision-making tasks that conflict with their values, there is increased activation in the anterior cingulate cortex (ACC) and the dorsolateral prefrontal cortex (DLPFC) (van Veen et al., 2009). These regions are associated with conflict monitoring and cognitive control, respectively. After a dissonant decision has been made, individuals show decreased ACC activity, indicating that dissonance reduction is accompanied by measurable changes in brain state, possibly reflecting self-deceptive rationalization.
Notably, cognitive dissonance theory helps explain not only why people deceive
themselves but also how they manage to believe false narratives despite available evidence. The process is often automatic, unconscious, and emotionally driven, making it difficult for individuals to detect their own bias. In this sense, self-deception is not always a deliberate lie to oneself but rather a psychological necessity that allows for self-coherence in the face of contradiction. As Gilbert (1991) suggests, people often believe first and question later; when a belief is emotionally appealing or self-serving, it may be accepted automatically unless a compelling reason forces reevaluation.
At the same time, this model also reveals the fragility of self-deception. When individuals are forced to confront incontrovertible evidence, especially in social contexts where justification is required, self-deceptive beliefs may unravel. However, as Chance et al. (2015) demonstrated, self-deception can revive over time, as individuals forget or reinterpret events in ways that restore previous illusions. This suggests that cognitive dissonance reduction is not a one-time process but an ongoing dynamic, vulnerable to both internal motivations and external feedback.
In conclusion, cognitive dissonance theory provides an empirically supported explanation for self-deception. By emphasizing the psychological discomfort of inconsistency and the motivated efforts to reduce it, this framework reveals how individuals can sincerely come to believe falsehoods about themselves and their actions. Whether through biased memory, motivated reasoning, or belief revision, self-deception can preserve a stable and positive self-concept, though at the cost of objective truth.
Debate
Despite extensive study, self-deception remains conceptually contested. Philosophers and psychologists differ on what exactly counts as self-deception, and some argue that the very idea is paradoxical: how can one simultaneously know and not know the same truth? The Stanford Encyclopedia of Philosophy (2023) notes that even a minimal definition is controversial, but that self-deception at least requires "maintain[ing] some false belief despite evidence to the contrary". Questions persist about whether self-deception is intentional or unconscious, whether it involves genuine belief or a "quasi-belief," and whether people are morally responsible for deceiving themselves.
One view holds that self-deception mirrors lying, in that it requires a degree of awareness and intentional distortion. However, theories such as Kunda's (1990) motivated reasoning suggest that people unconsciously manipulate how they attend to, interpret, and remember information in ways that align with their desires. This process allows individuals to protect their self-esteem without recognizing that any distortion is occurring. If one were fully aware of the deception, it would no longer serve its emotional purpose. Thus, the unconscious nature of self-deception is not only plausible but essential to its function.
Conclusion
Self-deception is a complex psychological process involving the distortion or denial of reality to protect self-esteem, reduce emotional distress, and maintain social advantages. Rooted in cognitive biases and motivated reasoning, it allows individuals to unconsciously and involuntarily shape beliefs that align with their desires, typically without awareness that any distortion is taking place. This automatic process shields individuals from distressing truths and sustains self-worth precisely because the person is unaware of the manipulation; if people knew they were lying to themselves, the deception would fail to protect their self-esteem.
Cognitive dissonance theory, psychoanalytic defense mechanisms, and evolutionary perspectives all offer valuable insights into how and why self-deception occurs. While it can serve adaptive purposes, promoting resilience, optimism, and social confidence, it also carries risks, including impaired decision-making, interpersonal conflict, and resistance to self-awareness. As research progresses, a more nuanced view of self-deception can inform interventions that balance self-protection with personal growth. Ultimately, self-deception endures because unconscious emotional and cognitive processes drive it, making it an automatic defense rather than a deliberate choice.
Bibliography
Aronson, Elliot. “The Return of the Repressed: Dissonance Theory Makes a Comeback.”
Psychological Inquiry 3, no. 4 (1992): 303–311.
https://doi.org/10.1207/s15327965pli0304_1.
Chance, Zoë, Francesca Gino, Michael I. Norton, and Dan Ariely. "The Slow Decay and Quick
Revival of Self-Deception." Frontiers in Psychology 6 (2015): 1075.
https://doi.org/10.3389/fpsyg.2015.01075.
Deweese-Boyd, Ian. “Self-deception.” Edited by Edward N. Zalta and Uri Nodelman.
Stanford Encyclopedia of Philosophy. Last modified September 21, 2023.
https://plato.stanford.edu/archives/fall2023/entries/self-deception/.
Dunning, David, Chip Heath, and Jerry M. Suls. “Flawed Self-Assessment: Implications for
Health, Education, and the Workplace.” Psychological Science in the Public Interest 5, no. 3
(2004): 69–106.
https://doi.org/10.1111/j.1529-1006.2004.00018.x.
The Editors of Encyclopaedia Britannica. “superego.” Encyclopedia Britannica, June 7, 2025.
https://www.britannica.com/science/superego.
Festinger, Leon, and James M. Carlsmith. “Cognitive Consequences of Forced Compliance.”
The Journal of Abnormal and Social Psychology 58, no. 2 (1959): 203–210.
https://doi.org/10.1037/h0041593.
Festinger, Leon. A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press, 1957.
https://psycnet.apa.org/record/1993-97948-000.
Gilbert, Daniel T. “How Mental Systems Believe.” American Psychologist 46, no. 2 (1991):
107–119.
https://doi.org/10.1037/0003-066X.46.2.107.
Kloppers, Mandy. “Why We Lie to Ourselves.” MentalHealth.com.
https://www.mentalhealth.com/library/why-we-lie-to-ourselves.
Kunda, Ziva. “The Case for Motivated Reasoning.” Psychological Bulletin 108, no. 3 (1990):
480–498.
https://doi.org/10.1037/0033-2909.108.3.480.
Mazar, Nina, On Amir, and Dan Ariely. "The Dishonesty of Honest People: A Theory of
Self-Concept Maintenance." Journal of Marketing Research 45, no. 6 (2008): 633–644.
https://doi.org/10.1509/jmkr.45.6.633.
McLeod, Saul. “What Is Cognitive Dissonance Theory?” SimplyPsychology.
https://www.simplypsychology.org/cognitive-dissonance.html.
Sedikides, Constantine, and A. P. Gregg. “Self-Enhancement: Food for Thought.”
Perspectives on Psychological Science 3, no. 2 (2008): 102–116.
https://doi.org/10.1111/j.1745-6916.2008.00068.x.
Steele, Claude M. "The Psychology of Self-Affirmation: Sustaining the Integrity of the Self."
Advances in Experimental Social Psychology 21 (1988): 261–302.
https://doi.org/10.1016/S0065-2601(08)60229-4.
Stone, Jeff, Elliot Aronson, Abigail L. Crain, Maria P. Winslow, and Cathy B. Fried.
“Inducing Hypocrisy as a Means of Encouraging Young Adults to Use Condoms.”
Personality and Social Psychology Bulletin 20, no. 1 (1994): 116–128.
https://doi.org/10.1177/0146167294201012.
van Veen, Vincent, Michael K. Krug, Jonathan W. Schooler, and Cameron S. Carter.
“Neural Activity Predicts Attitude Change in Cognitive Dissonance.” Nature Neuroscience 12,
no. 11 (2009): 1469–1474.
https://doi.org/10.1038/nn.2413.
von Hippel, William, and Robert Trivers. “The Evolution and Psychology of Self-Deception.”
Behavioral and Brain Sciences 34, no. 1 (2011): 1–16.
https://doi.org/10.1017/S0140525X10001354.