Common Cognitive Biases in Caring for Patients

By Debbie DiazGranados, Ph.D.
November 27, 2017

I’m the best driver I know.

There I said it.

It’s obvious. Look at those drivers driving too fast (or too slow), riding uncomfortably close to my bumper, weaving through traffic, not stopping for three seconds at a stop sign, skipping their turn signal, braking too often or too hard, running a red light, drifting across the lane line – the list could go on and on.

Sound familiar?

Why do I (we) think this? And how is this related to patient care? The short answer – bias.

Humans are hardwired to seek out the simplest explanation for things. Typically this preserves valuable cognitive resources, but these cognitive shortcuts can be misleading – resulting in biases and ultimately ill-informed decisions. Driving is an example that most people can relate to.

Health care workers are equally susceptible. The difference is that in health care, decision-making heuristics can lead to errors in patient care. A quick Google search reveals that there are somewhere in the neighborhood of 160 identified biases that we can be prone to.

I cannot cover them all here, but three in particular – confirmation bias, the fundamental attribution error, and framing – provide good examples of how bias pervades patient care.

Confirmation Bias

Unconsciously everybody likes to be right.

Confirmation bias is an example of this. People tend to seek out and interpret information in a way that confirms their preexisting beliefs. In health care this can lead to harmful outcomes. Antecedent thoughts about a patient’s condition can lead to misdiagnosis and potentially incorrect treatment approaches. There are a number of biases that situationally contribute to confirmation bias.

What does this look like? An infantry non-commissioned officer (NCO) with three prior combat deployments reports experiencing several symptoms, emphasizing frequent headaches. The health care worker places additional importance on the headaches while searching for a suitable diagnosis (aka, anchoring effect – overreliance on one piece of information). The health care worker determines that the NCO has a history of mild traumatic brain injury (TBI) from high school and from a blast on a deployment. A history of TBI is one of the most likely causes based on the combination of symptoms (aka, search satisficing – stop searching for alternate explanations once a logical one has been found). A diagnosis is entered into the electronic medical record. Now every time someone new views this patient’s medical notes, post-concussive headaches is their starting point – unintentionally reducing the chance that an alternative diagnosis will emerge (aka, diagnostic momentum – accepting a previous diagnosis without sufficient skepticism). Emerging information that contradicts the original diagnosis is disregarded as an unlikely explanation for the condition (aka, Semmelweis reflex – rejection of new evidence that contradicts the current diagnosis). The NCO’s underlying neurologic condition goes undiagnosed for another two years.

In that example, there are four examples of bias that occur at different phases of the diagnosis process. The result is a diagnosis that is predicated on pre-existing beliefs about the patient’s condition – or confirmation bias.

Fundamental Attribution Error

Context is not as obvious as we think.

The fundamental attribution error is the tendency to overrate internal motivations (i.e., personality) and underrate external, situational factors when judging others’ behavior. Read the following two sentences:

The sailor arrived at the facility visibly out of breath, and complained about not sleeping well the previous night. Based on further testing, it was determined that the sailor was at an increased risk of a serious health condition.

Now re-read that statement and put overweight before the word sailor in each sentence. Try again with middle-aged. Young. Agitated. Boisterous. Fit. On limited duty. Enlisted. Officer. You can insert virtually any adjective, and your point of reference for that sailor probably changes.

Taking these pieces of information into account is not always bad; sometimes they provide clues to the patient’s problem. It becomes harmful when you start assigning personality traits to these adjectives. An overweight sailor may be subconsciously categorized as lazy, when his or her weight gain may be the result of situational factors like difficulty understanding portion sizes, financial concerns that lead to choosing cheaper, less healthful foods, or thyroid dysfunction. That is, the error is not just in the attribution, but also in our failure to consider the modifiable factors that underlie the source of the attribution.

In health care, painting patient behavior in broad character judgment strokes can influence not just patient/care provider interactions, but also the quality of care given.


Framing

It is all about how you word it.

The way that a choice is presented has a major impact on decision-making; this is known as the framing effect. A health care worker’s day is full of decisions that involve a patient’s general well-being, sometimes with major life-changing implications. Most of these decisions are framed in terms of the prospect of loss or gain. Positive or negative. A form of cognitive risk management.

Consider the following example:

A new behavioral health program has been introduced for treating posttraumatic stress disorder (PTSD). The program is voluntary and the decision to implement is made at the provider level. At medical treatment facility (MTF) A the treatment is described as having a 67 percent chance of success. At MTF B the treatment is described as having a 33 percent chance of failure.

At MTF A there is a high rate of adoption at the care provider level. On the other hand, at MTF B there is a low rate of adoption at the care provider level. Why is that? The program is the same. The probabilities are the same. The choice is the same. What is different? The way the risk is framed.

Numerous studies in health care have demonstrated that framing a choice positively (e.g., survival) generates more risk-taking decision-making, whereas framing it negatively (e.g., mortality) generates more conservative decision-making. This is a simplified example, but it demonstrates the power that rewording the same choice can have.

When presented with a decision, to overcome the framing effect, it is important to think about the decision from as many perspectives as possible. This can be done by asking questions and engaging in mental reframing – to provide a more complete basis for making decisions.           


My husband rolls his eyes every time I claim roadway superiority. It doesn’t matter how many traffic violations or how many fender benders I’ve been a part of – in my mind I’m a great driver.

The superiority illusion is just another example of how bias pervades our lives. Based on what I know, it’s hard to prevent bias. However, by being aware of your biases you can better recognize when they’re influencing your decisions and the decisions of others around you.

But maybe I’m just biased.

Dr. Debbie DiazGranados is a contracted subject matter expert in industrial/organizational psychology at the Psychological Health Center of Excellence. She has a doctorate in industrial/organizational psychology and has studied teamwork in health care for more than 10 years.

The views expressed in Clinician's Corner blogs are solely those of the author and do not necessarily reflect the opinion of the Psychological Health Center of Excellence or Department of Defense.


  • Excellent, thought-provoking commentary. Eye-opening. Thanks for posting.

  • Great article. I remember working in the mid-90s as a war trauma mental health therapist in the VA’s Vet Center and seeing WWII vets who had been diagnosed years ago with psychoses like schizophrenia and medicated for years with anti-psychotic drugs. All the practitioners in the VA over the years would read the chart when they got the case and assume, without intellectual, compassionate curiosity. After doing extensive interviews with these vets and their significant others, I found that they had PTSD, and their initial symptom activations were classic flashbacks triggered by sights and sounds in the workplace. It was a lot of work to get the VA to adjudicate the PTSD and give the vets service-connected disability. These poor guys were in their 80s, and it was a bit late; their lives were ruined by these clinical chart assumptions. A couple of them died knowing that they were not crazy, and their significant others were very thankful that they were finally heard by someone.
