PERSPECTIVE article

Front. Psychol., 09 February 2016
Sec. Cognitive Science
This article is part of the Research Topic Deception, honesty, and cognitive load: Is lying always more effortful than truth telling?

Self-Deception in Terminal Patients: Belief System at Stake

  • 1 Mind-Brain Group, Institute for Culture and Society, University of Navarra, Pamplona, Spain
  • 2 Unit of Medical Education and Bioethics, School of Medicine, University of Navarra, Pamplona, Spain
  • 3 International Association of Catholic Bioethicists, Toronto, ON, Canada
  • 4 Hospital Universitario de Burgos, Burgos, Spain

A substantial minority of patients with terminal illness hold unrealistically hopeful beliefs about the severity of their disease or the nature of its treatment, considering therapy as curative rather than palliative. We propose that this attitude may be understood as self-deception, in line with current psychological theories on the topic. In this article we suggest that the reason these patients deceive themselves is to preserve their belief systems. According to some philosophical accounts, the human belief system (HBS) is constituted as a web with a few stable central nodes – deep-seated beliefs – intimately related to the self. We hypothesize that the mind may possess defensive mechanisms, mostly non-conscious, that reject certain sensory inputs (e.g., a fatal diagnosis) that may undermine deep-seated beliefs. This interpretation is in line with the theory of cognitive dissonance. Following this reasoning, we also propose that HBS-related self-deception entails a lower cognitive load than confronting the truth: whereas the latter engages a myriad of higher cognitive functions to reconfigure crucial aspects of the self, including the setting of plans, goals, or even a behavioral output, the former is mostly non-conscious. Overall, we believe that our analysis supports the hypothesis that, in cases of terminal illness, (self-)deceiving requires less effort than accepting the truth.

Introduction

A substantial minority of terminally ill patients hold unrealistic beliefs about either the severity of their disease or the nature of its treatment, expecting that therapy will lead to their recovery. This perspective article proposes that such cases can be considered self-deception; furthermore, we propose that the unwillingness to accept the reality of their condition entails a lower cognitive load than accepting the truth, owing to the activation of defensive mechanisms of the human belief system (HBS). We begin by considering self-deception as a special case within theories of deception, and we discuss whether holding out false hope in this context may be regarded as self-deception. Then, we explore current proposals about the function of the HBS to explain how it may promote self-deception. Finally, we consider why its activation can lessen cognitive load in life-threatening circumstances. In light of these considerations, we suggest that physicians should understand the values and beliefs of their patients in order to choose the best strategy: whether to promote personal autonomy or to allow self-deception.

Terminal Patients and Self-Deception

Most patients want to know the exact prognosis of their illness. However, in specific circumstances such as the terminal stages of cancer or a severe degenerative disease, a substantial minority (between 15 and 25%) prefer not to be told (Schattner and Tal, 2002). These preferences are often accompanied by unrealistic but more hopeful beliefs. For example, nearly 70% of advanced oncological patients reported the belief that chemotherapy was likely to cure their cancer, despite physicians’ efforts to convey the seriousness of their illness and the nature of its treatment (Weeks et al., 2012). These data empirically support what has long been known: the existence of an emotional stage of denial (Kübler-Ross, 1969; Mackillop et al., 1988; Gattellari et al., 1999). We propose that this psychological attitude of terminal patients can be understood as self-deception, as we elaborate below.

The article by Weeks et al. (2012), for example, reports that most oncological patients think that chemotherapy will cure their cancer despite physician indications to the contrary. While this may simply reflect miscommunication between doctor and patient (The et al., 2001; Lee Char et al., 2010), a substantial factor rests in the unwillingness to accept such news. In fact, Weeks et al. (2012) suggest that “patients perceive physicians as better communicators when they convey a more optimistic view of chemotherapy.” In other words, these patients may deceive themselves by biasing the source of information. Smith and Longo (2012) consider that the results of Weeks et al. (2012) may be due jointly to ineffective communication and self-deception. Along the same lines, Mitera et al. (2012) report that 15% of patients with advanced cancer believed that radiotherapy would cure their disease, and 45% believed that it would prolong their lives, even after taking part in an information program. In summary, these articles show that self-deception, although it may not be the only factor, contributes to the distorted view that terminal patients have of their condition.

von Hippel and Trivers (2011) propose that self-deception is an evolutionary resource to improve interpersonal deception. For this reason, deception and self-deception are closely related and share some features. In mainstream research on deception (see, for example, Bond and DePaulo, 2006), the deceiver is required to hold in mind two different representations of reality, truth and falsehood, either of which can be communicated. This view, however, has been challenged by McCornack’s Information Manipulation Theory (IMT) and IMT2, which propose that a lie other than a “bald-faced lie” is possible, and less challenging, by altering the expectations of the listener in terms of the quantity, quality, manner, and relevance of the message (McCornack, 1992; McCornack et al., 2014), without the need to keep two accounts in mind. The classical account of self-deception admits a similar dichotomous paradigm: the truth is stored in the unconscious whereas the false discourse is consciously available (Gur and Sackeim, 1979). Like McCornack (1992), von Hippel and Trivers (2011) challenge this side-by-side representation view and propose other mechanisms of self-deception, including biased information search strategies, biased interpretive processes, and biased memory processes. Another classical feature of (self-)deception that has been challenged by these authors is volition. In short, following the mainstream lines of research, we could say that deception is produced at two different times: an a priori intention-to-deceive event and an a posteriori act-of-deception event. The deceiver’s will to deceive is therefore a necessary condition. McCornack (1992), in contrast, argues that information, rather than volition, is the primary causal antecedent of deception: “When people possess information that they deem too problematic to disclose, they will deceive.” Moreover, von Hippel and Trivers (2011) argue that self-deception may exist at both conscious and unconscious levels when individuals deceive themselves; they suggest that information bias strategies are not necessarily intentional. In our opinion, these characteristics of (self-)deception support understanding terminal patients’ unrealistically high expectations as self-deception. Paraphrasing McCornack’s sentence above: when people possess information that they deem too problematic to disclose to themselves, they will deceive themselves. Moreover, it is plausible that terminal patients adopt biased information search strategies, in which information may even be consciously suppressed (visiting doctors who seem more optimistic about their prognosis), biased interpretive processes (convincing themselves that chemotherapy will cure their disease), or biased memory processes (misremembering what the doctor actually said).

Alternatively, one could argue that self-deception is not comparable with deception because it lacks a behavioral output. According to Levine’s (2014) Truth-Default Theory, for example, deception may include self-deception so long as the message has a deceptive purpose, even if it is unconscious. In the present article, we do not discuss whether self-deception always ends up in interpersonal deception, although it unquestionably sometimes does. In any case, we believe that the three strategies described by von Hippel and Trivers (2011) suggest that such a behavioral output (i.e., telling a lie) is not necessary for self-deception in some cases. Biased strategies may be understood as mental acts (Anscombe, 1957). For example, according to Mack et al. (2007), some parents of children with cancer are unable to assimilate precisely the relevant information about the fatal prognosis, a fact that may be related to volition and the unconscious mind. Although it may sound provocative, such psychological reactions could be labeled non-conscious self-deceptions. Other authors have also proposed that a behavioral output is not a requirement of self-deception. In a recent empirical study, Chance et al. (2015) state that self-deception can allow people to hold preferred beliefs regardless of the truth. They exemplify self-deception citing Mele (2001): “stock examples of self-deception, both in popular thought and in the literature, feature people who falsely believe – in the face of strong evidence to the contrary – that their spouses are not having affairs (…) or that they themselves are not seriously ill”.

The Role of the HBS in Human Identity

What could be the goal of denying the imminence of one’s own death? One of the common answers in the recent medical literature is keeping hope alive (Trope and Neter, 1994; Deschepper et al., 2008; Pergert and Lutzen, 2012). Recent research has demonstrated the psychological benefit of self-deception, at least in the short term (Chance et al., 2011, 2015). “Hope is a good breakfast,” wrote Bacon (2010), “but it is a bad supper.” In this context, Weeks et al. (1998) show that cancer patients with falsely optimistic beliefs about their prognosis tended to choose life-extending therapy over comfort care, even though the former was aggressive. In the following paragraphs, we discuss the biological value of hope in relation to the maintenance of the self.

Hope is built on confident expectations and, as the psychiatrist Adler (1931) studied in depth, goal-seeking is central to the development and maintenance of the self. Hope, like happy memories, gives human experiences a particular temporal connection – a web in which the moral agent is consolidated (Treisman and McHugh, 2011). Similarly, Dennett (1991) has defined the self as a center of narrative gravity, a powerful fiction that provides efficient ties of coherence among beliefs, goals, and behavior. We are a crossroads between past and future, and without a future (or with negative expectancies) the human self is severely undermined (Dennett, 1991). Anticipating the future is therefore critical for adaptive behavior but, more importantly, for keeping mental balance by defending the self. Hence, we may assume the existence of some kind of psychological homeostatic regulation that preserves the logical connections among beliefs, to the benefit of the person; this system of beliefs may be regarded as the clearest representation of human identity. The work of several researchers (Wilson, 1998; Damasio, 2010; Carroll, 2012) assumes the existence of neuropsychological mechanisms that constitute the conditions under which the HBS can be implemented and maximized, including defense systems against potential informational threats. In this context, the preservation of the mind is subsumed under a cluster of homeostatic mechanisms that embrace informational as well as biological integrity.

The relationship between self-deception and the HBS can also be understood under the umbrella of a different – although complementary – paradigm: cognitive dissonance. According to Festinger (1957), the inconsistency between a sensory input and (for example) a deep-seated belief is psychologically uncomfortable for the agent, who tends to reduce that dissonance. To do so, they may avoid certain situations and incoming information. We note here similarities with von Hippel and Trivers’ (2011) theory of self-deception, in particular the bias in selecting incoming information. Our view of deceiving oneself to protect the central nodes of the HBS fits well with Festinger’s (1957) theory. Although a thorough analysis of the similarities and differences between his theory and our proposal is beyond the scope of this article, we note that other authors have related cognitive dissonance and information avoidance in serious diseases such as cancer (Case et al., 2005).

The relationship between the distorted high expectations of terminal patients, the preservation of the self, and the HBS may be better understood by commenting on two characteristics of the maintenance and defense mechanisms of the HBS: limited flexibility and goal-oriented inertia. Concerning the first, beliefs are linked to one another in a more or less strongly coherent way. As Davidson (1980) wrote, “there is no assigning beliefs to a person one by one”. Instead of considering beliefs in isolation, it is more adequate to speak of a web of beliefs or a belief system, since beliefs make sense only in relation to each other. However, the network has to be sufficiently adaptable to allow changes and even some degree of contradiction among beliefs. Considering neural plasticity and following the web metaphor, Quine and Ullian (1970) propose a spider model in which the HBS would have only a few nuclear nodes, formed by rigid deep-seated beliefs, and multiple peripheral areas that may vary with respect to the nuclear beliefs. Conflicts in the latter would be more innocuous, and thus better tolerated, than nuclear tensions, which would compromise the stability of the entire web and therefore the agent’s psychical condition. Concerning goal-oriented inertia, as Korsgaard (2009) has pointed out, the reason to choose a belief is different from the reason to make choices. We choose beliefs because they make up what we are: choosing them is thus a psychological need, one that constitutes the agent’s identity and builds the subject’s unity.
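To make the spider model more concrete, the following toy sketch (our own illustration, not part of the cited works) represents the HBS as a small set of belief nodes whose rigidity determines whether a contradicting input revises the web or is filtered out. All belief names, rigidity values, and the threshold rule are hypothetical choices made purely for illustration.

```python
# Illustrative sketch only: a toy "web of beliefs" with a few rigid nuclear
# nodes and more flexible peripheral nodes, loosely inspired by the spider
# model discussed above. All names and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class Belief:
    content: str
    rigidity: float  # 0.0 = fully revisable (peripheral), 1.0 = deep-seated (nuclear)

web = {
    "my_life_has_a_future": Belief("My life has a future", rigidity=0.95),  # nuclear
    "doctors_are_reliable": Belief("Doctors are reliable", rigidity=0.60),
    "chemo_is_curative":    Belief("Chemotherapy will cure me", rigidity=0.20),  # peripheral
}

def process_input(web, target_key, evidence_strength):
    """Accept or reject an incoming piece of information that contradicts a belief.

    If the contradicted belief is peripheral (low rigidity), the web is revised;
    if it is nuclear (high rigidity), the input is filtered out -- a crude analog
    of the non-conscious defensive mechanism discussed in the text.
    """
    belief = web[target_key]
    if evidence_strength > belief.rigidity:
        return f"Revise peripheral belief: '{belief.content}' is updated."
    return f"Reject input: '{belief.content}' is too central to give up (self-deception)."

# A fatal prognosis contradicting a nuclear belief is rejected by the toy model,
# whereas the same strength of evidence revises a peripheral belief.
print(process_input(web, "my_life_has_a_future", evidence_strength=0.8))
print(process_input(web, "chemo_is_curative", evidence_strength=0.8))
```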

Cognitive Load in HBS-Related Self-Deceptions

The abovementioned HBS traits suggest why bad news about prognosis has fatal consequences for the self: first, because it makes a significant dent in deep-seated beliefs; and second, because it often destroys the agent’s goals and plans, thus paralyzing present actions. Accordingly, self-deceptions about death should be classified as serious deceptions, that is, psychological strategies to avoid a “social context in which sharing the truth [including with oneself] might prove very costly to individuals in not meeting their goals” (Walczyk et al., 2014; our text between square brackets). This perspective may help clarify the current controversy about the cognitive load of deception. At least with regard to HBS-related serious self-deceptions, we argue that they require a lower cognitive load, in several respects, as discussed in the following paragraphs.

First, from a functional point of view, changing one’s own deep-seated beliefs – facing reality – would imply launching multiple abilities in order to reconfigure a large part of the whole web of beliefs. Moreover, it is reasonable to assume that many brain areas are involved in these multitasking processes, which would entail a higher metabolic rate, as suggested by the correlation between glucose metabolism and successful performance in executive functions (Karlamangla et al., 2014). In contrast, HBS-related self-deceptions would carry a lower cognitive load, since the self-deceiver does not need to construct realistic representations beforehand, to formulate a preliminary plan to deceive, or to produce a motor behavior to implement it.

Second, the relatively low cognitive demand of HBS-related self-deceptions may also be justified from a connectionist approach, i.e., by considering the flow of information through networks, and in particular parallel distributed processing (PDP). Unlike modular processing (vertical, localized, and domain-specific), PDP cuts across content domains and is not carried out in a step-by-step procedure in which representations are informationally encapsulated (Bechtel and Abrahamsen, 1991). This cognitive resource is also proposed by McCornack et al. (2014) in the central premise of their IMT2 as the main characteristic of the speech production system that leads to deceptive or truthful discourses. The key issue here is that, assuming the HBS as a web, modular brain processing, and the premises of IMT2, cognitive effort in PDP is shared among a huge number of nodes that are activated or inhibited simultaneously, thus entailing higher speed in the management of inputs and greater energy efficiency.
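As a purely illustrative sketch (our own toy example under the assumptions above, not a model taken from the cited works), a single PDP update can be pictured as all nodes of the web responding at once to an incoming input through weighted excitatory and inhibitory connections, rather than being processed one node at a time. The network size, connection weights, and activation function below are arbitrary.

```python
# Toy illustration of one parallel distributed processing (PDP) step:
# every node of a small "web of beliefs" is updated simultaneously, so the
# handling of an input is spread across the whole network in one operation.
# All numbers here are arbitrary and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_nodes = 8                                         # hypothetical belief nodes
weights = rng.normal(0.0, 0.5, (n_nodes, n_nodes))  # excitatory/inhibitory links
activation = np.zeros(n_nodes)
activation[0] = 1.0                                 # a single incoming sensory input

def pdp_step(activation, weights):
    """Update all nodes at once via one matrix-vector product."""
    return np.tanh(weights @ activation)

activation = pdp_step(activation, weights)
print(np.round(activation, 2))  # every node responds to the input in parallel
```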

Third, the application of the connectionist approach to the HBS is also useful for understanding the possibility of non-conscious HBS-related self-deception. In fact, the common understanding of the HBS assumes its non-conscious nature (Davidson, 2001). Indeed, the web of beliefs is never entirely conscious at any given moment; that is, the HBS works mainly at non-conscious levels (Desender and Van den Bussche, 2012). Once again, this view is supported by IMT2: “because most of this processing and behavioral production occurs at the unconscious level, it may very well be the case that so-called ‘decisions’ about deception actually are made prior to conscious awareness” (McCornack et al., 2014). A new question then arises: are non-conscious self-deceptions less cognitively demanding than conscious processes? We strongly believe that the conscious acceptance of a sensory input that threatens one’s own HBS involves a weakening of the self, which leads, in turn, to the necessity of struggling against fear, stress, and anxiety (Thorson and Powell, 1988; DePaulo et al., 2004). Therefore, following a non-conscious self-deception seems cognitively lighter than accepting the threatening truth.

We would like to mention two possible limitations of our proposal. First, most self-deceptions – like deceptions in general – involve handling truthful information, not only in order to generate fictional but plausible narratives, but also because they may be generated without prior warning during an honest thinking process (McCornack, 1997). This means that the possibility of measuring the cognitive load of an isolated self-deception is questionable. Second, as we have discussed, the cognitive load of self-deception is very low when it is non-conscious or involves only mental acts. However, self-deception may also include other processes, such as a behavioral output – telling others – or the pursuit of a conscious goal. Hence, it is reasonable to think that cognitive load would increase in these more complex self-deceptions.

Finally, in order to evaluate the cognitive load involved in HBS-related self-deceptions, it is important to bear in mind the cultural environment of the agent. Stanley et al. (2008) have shown that multiple attitudes are implicitly assimilated, i.e., without conscious intervention, as a consequence of social influences, and that they are correlated with rearrangements of neural structure. Death has played a fundamental role in several cultures across history, to the extent that, within them, all stages of human life, to a greater or lesser degree, draw their significance from this final event. One possible attitude toward this event is preparation (Palmer, 1996), a process akin to cognitive reappraisal, which is known to modulate brain activity associated with emotional responding (Farb et al., 2010). Thus, in such social environments, news about one’s own death would be better assimilated, which means that self-deception would evoke a greater load than facing reality. However, according to Ariès (1982), death has become forbidden in Western postmodern society. If he is right, self-deceptions in terminal patients may be the outcome of a lengthy process of attitudinal assimilation of cultural predilections. It would be the easiest and quickest way of surviving, at least in the short term.

Conclusion

Truth-telling in healthcare is a generally adopted ethic in Western countries, even if it entails ending patients’ illusions in cases of serious illness. Our considerations here suggest that, if deep beliefs are critical for self-subsistence, then some information may be more harmful than the adverse consequences of any deception, especially when individuals do not have the strength to reconfigure a new identity. What is the biggest concern about death for terminal patients: the thought of death itself – Heidegger’s “being-for-death” – or the thought of living in an unfamiliar world – transforming the way of “being-in-the-world”? We think that the second is the more exhausting challenge, although some patients may prefer facing reality at the risk of oedipal madness. Although telling the truth to patients may mean respecting their autonomy, serious self-deceptions are not always conscious and voluntary. Thus, physicians could make patients aware of the defensive mechanisms that lead to self-deception, unless doing so would incapacitate patients from making decisions. This poses a dilemma: should the doctor insist on the patient’s understanding of the truth, or allow self-deception? This question is at the very base of our perspective, since not everybody accepts a narrative view of themselves: some people are naturally disposed to conceive their life in a more fragmented way, without giving such importance to their past and future (Strawson, 2015). Although we do not agree with Strawson’s (2015) radical non-narrative interpretation of the self, we accept that the need to keep a narrative self may vary among people. For that reason, physicians should make an effort to understand the values and beliefs of their patients, and make the appropriate decision in each case.

Author Contributions

All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.

Funding

Our research is supported by Obra Social La Caixa and the Institute for Culture and Society (ICS).

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

The reviewer, Steven Allen McCornack, and the handling Editor declared a current collaboration, and the handling Editor states that the process nevertheless met the standards of a fair and objective review.

Acknowledgment

The authors appreciate the suggestions of the Mind-Brain Group members in the preparation of this manuscript.

References

Adler, A. (1931). What Life Could Mean to You. Oxford: Oneworld Publications.

Anscombe, G. (1957). Intention. Cambridge, MA: Harvard University Press.

Ariès, P. (1982). The Hour of Our Death. New York, NY: Vintage Books.

Bacon, F. (2010). A Collection of Apophthegms New and Old. Apophthegm No. 36. Whitefish, MT: Kessinger Publisher.

Bechtel, W., and Abrahamsen, A. (1991). Connectionism and the Mind: An Introduction to Parallel Processing in Networks. Cambridge, MA: Basil Blackwell.

Bond, C. F., and DePaulo, B. M. (2006). Accuracy of deception judgments. Pers. Soc. Psychol. Rev. 10, 214–234. doi: 10.1207/s15327957pspr1003_2

Carroll, J. (2012). The truth about fiction: biological reality and imaginary lives. Style 46, 129–160.

Case, D. O., Andrews, J. E., Johnson, J. D., and Allard, S. L. (2005). Avoiding versus seeking: the relationship of information seeking to avoidance, blunting, coping, dissonance, and related concepts. J. Med. Libr. Assoc. 93, 353–362.

Chance, Z., Gino, F., Norton, M. I., and Ariely, D. (2015). The slow decay and quick revival of self-deception. Front. Psychol. 6:1075. doi: 10.3389/fpsyg.2015.01075

Chance, Z., Norton, M. I., Gino, F., and Ariely, D. (2011). Temporal view of the costs and benefits of self-deception. Proc. Natl. Acad. Sci. U.S.A. 108, 15655–15659. doi: 10.1073/pnas.1010658108

Damasio, A. (2010). Self Comes to Mind: Constructing the Conscious Mind. New York, NY: Pantheon.

Davidson, D. (1980). Essays on Actions and Events. Oxford: Oxford University Press.

Davidson, D. (2001). Subjective, Intersubjective, Objective. Oxford: Oxford University Press.

Dennett, D. (1991). Consciousness Explained. Boston, MA: Little, Brown & Co.

DePaulo, B., Ansfield, M., Kirkendol, S., and Boden, J. (2004). Serious lies. Basic Appl. Soc. Psychol. 26, 147–167.

Deschepper, R., Bernheim, J. L., Vander Stichele, R., Van den Block, L., Michiels, E., Van Der Kelen, G., et al. (2008). Truth-telling at the end of life: a pilot study on the perspective of patients and professional caregivers. Patient Educ. Couns. 71, 52–56. doi: 10.1016/j.pec.2007.11.015

Desender, K., and Van den Bussche, E. (2012). Is consciousness necessary for conflict adaptation? A state of the art. Front. Hum. Neurosci. 6:3. doi: 10.3389/fnhum.2012.00003

Farb, N., Anderson, A., Bean, J., McKeon, D., Mayberg, H., and Segal, Z. (2010). Minding one’s emotions: mindfulness training alters the neural expression of sadness. Emotion 10, 25–33. doi: 10.1037/a0017151

Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford: Stanford University Press.

Gattellari, M., Butow, P. N., Tattersall, M. H. N., Dunn, S. M., and MacLeod, C. A. (1999). Misunderstanding in cancer patients: why shoot the messenger? Ann. Oncol. 10, 39–46. doi: 10.1023/A:1008336415362

Gur, R. C., and Sackeim, H. A. (1979). Self-deception: a concept in search of a phenomenon. J. Pers. Soc. Psychol. 37, 147–169. doi: 10.1037/0022-3514.37.2.147

Karlamangla, A. S., Miller-Martinez, D., Lachman, M. E., Tun, P. A., Koretz, B. K., and Seeman, T. E. (2014). Biological correlates of adult cognition: midlife in the united states (MIDUS). Neurobiol. Aging 35, 387–394. doi: 10.1016/j.neurobiolaging.2013.07.028

Korsgaard, C. (2009). Self-Constitution: Agency, Identity, and Integrity. New York, NY: Oxford University Press.

Kübler-Ross, E. (1969). On Death and Dying. New York, NY: The MacMillan Company.

Lee Char, S. J., Evans, L. R., Malvar, G. L., and White, D. B. (2010). A randomized trial of two methods to disclose prognosis to surrogate decision makers in intensive care units. Am. J. Respir. Crit. Care Med. 182, 905–909. doi: 10.1164/rccm.201002-0262OC

Levine, T. R. (2014). Truth-default theory (TDT): a theory of human deception and deception detection. J. Lang. Soc. Psychol. 33, 378–392. doi: 10.1177/0261927X14535916

Mack, J. W., Cook, E. F., Wolfe, J., Grier, H. E., Cleary, P. D., and Weeks, J. C. (2007). Understanding of prognosis among parents of children with cancer: parental optimism and the parent-physician interaction. J. Clin. Oncol. 25, 1357–1362. doi: 10.1200/JCO.2006.08.3170

Mackillop, W. J., Stewart, W. E., Ginsburg, A. D., and Stewart, S. S. (1988). Cancer patients’ perceptions of their disease and its treatment. Br. J. Cancer 58, 355–358.

McCornack, S. A. (1992). Information manipulation theory. Commun. Monogr. 59, 1–16. doi: 10.1080/03637759209376245

McCornack, S. A. (1997). “The generation of deceptive messages: laying the groundwork for a variable theory of interpersonal deception,” in Message Production: Advances in Communication Theories, ed. J. Greene (Mahwah, NJ: Lawrence Erlbaum Associates).

McCornack, S. A., Morrison, K., Paik, J. E., Wisner, A. M., and Zhu, X. (2014). Information manipulation theory 2: a propositional theory of deceptive discourse production. J. Lang. Soc. Psychol. 33, 348–377. doi: 10.1177/0261927X14534656

Mele, A. (2001). Self-deception Unmasked. Princeton, NJ: Princeton University Press.

Mitera, G., Zhang, L., Sahgal, A., Barnes, E., Tsao, M., Danjoux, C., et al. (2012). A survey of expectations and understanding of palliative radiotherapy from patients with advanced cancer. Clin. Oncol. 24, 134–138. doi: 10.1016/j.clon.2011.09.001

Palmer, L. (1996). “Framing death: cultural and religious responses,” in Facing Death, eds H. Spiro, L. P. Wandel, and M. G. McCrea Curnen (New Haven, CT: Yale University Press), 111–113.

Pergert, P., and Lutzen, K. (2012). Balancing truth-telling in the preservation of hope: a relational ethics approach. Nurs. Ethics 19, 21–29. doi: 10.1177/0969733011418551

Quine, W., and Ullian, J. (1970). The Web of Belief. New York, NY: Random House.

Schattner, A., and Tal, M. (2002). Truth telling and patient autonomy: the patient’s point of view. Am. J. Med. 113, 66–69. doi: 10.1016/S0002-9343(02)01127-0

Smith, T., and Longo, D. (2012). Talking with patients about dying. N. Engl. J. Med. 367, 1651–1652. doi: 10.1056/NEJMe1211160

Stanley, D., Phelps, E., and Banaji, M. (2008). The neural basis of implicit attitudes. Curr. Dir. Psychol. Sci. 17, 164–170. doi: 10.1111/j.1467-8721.2008.00568.x

Strawson, G. (2015). “The unstoried life,” in On Life-Writing, ed. Z. Leader (Oxford: Oxford University Press).

The, A., Hak, T., Koëter, G., and van der Wal, G. (2001). Collusion in doctor-patient communication about imminent death: an ethnographic study. West. J. Med. 174, 247–253. doi: 10.1136/ewjm.174.4.247

Thorson, J., and Powell, F. (1988). Elements of death anxiety and meanings of death. J. Clin. Psychol. 44, 691–701. doi: 10.1002/1097-4679(198809)44:5<691::AID-JCLP2270440505>3.0.CO;2-D

Treisman, G., and McHugh, P. (2011). “Life story as the focus of psychotherapy: the Johns Hopkins conceptual and didactic perspectives,” in The Psychotherapy of Hope, eds R. D. Alarcón and J. B. Frank (Baltimore, MD: The Johns Hopkins University Press).

Trope, Y., and Neter, E. (1994). Reconciling competing motives in self-evaluation: the role of self-control in feedback seeking. J. Pers. Soc. Psychol. 66, 646–657. doi: 10.1037/0022-3514.66.4.646

von Hippel, W., and Trivers, R. (2011). The evolution and psychology of self-deception. Behav. Brain Sci. 34, 1–16. doi: 10.1017/S0140525X10001354

Walczyk, J., Harris, L., Duck, T., and Mulay, D. (2014). A social-cognitive framework for understanding serious lies: activation-decision-construction-action theory. New Ideas Psychol. 34, 22–36. doi: 10.1016/j.newideapsych.2014.03.001

Weeks, J., Catalano, P., Cronin, A., Finkelman, M., Mack, J., Keating, N., et al. (2012). Patients’ expectations about effects of chemotherapy for advanced cancer. N. Engl. J. Med. 367, 1616–1625. doi: 10.1056/NEJMoa1204410

Weeks, J. C., Cook, E. F., O’Day, S. J., Peterson, L. M., Wenger, N., Reding, D., et al. (1998). Relationship between cancer patients’ predictions of prognosis and their treatment preferences. JAMA 279, 1709–1714. doi: 10.1001/jama.279.21.1709

Wilson, E. (1998). Consilience: The Unity of Knowledge. New York, NY: Alfred A. Knopf.

Keywords: deception, cognitive dissonance, cognitive load, personal identity, self

Citation: Echarte LE, Bernacer J, Larrivee D, Oron JV and Grijalba-Uche M (2016) Self-Deception in Terminal Patients: Belief System at Stake. Front. Psychol. 7:117. doi: 10.3389/fpsyg.2016.00117

Received: 20 July 2015; Accepted: 21 January 2016;
Published: 09 February 2016.

Edited by:

Jeffrey John Walczyk, Louisiana Tech University, USA

Reviewed by:

Laura Visu-Petra, Babes-Bolyai University, Romania
Steven Allen McCornack, The University of Alabama at Birmingham, USA

Copyright © 2016 Echarte, Bernacer, Larrivee, Oron and Grijalba-Uche. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Javier Bernacer, jbernacer@unav.es

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.