Psychologists have studied the judgment of experts at least since the Tenerife airport disaster. On March 27, 1977, KLM’s top Boeing 747 pilot started to take off on a foggy runway in the Canary Islands. The flight engineer, the lowest-ranking officer in the cockpit and not a qualified pilot, told the pilot that a Pan Am 747 was still taxiing on the runway. The pilot dismissed the flight engineer’s objection. Forty-seven seconds later the KLM 747 crashed into the Pan Am 747, killing 583 people in the worst accident in aviation history.

Psychologists wondered why the flight engineer, who was not an expert, understood the situation, while the pilot and co-pilot, who were experts, made the wrong decision. Psychologists have identified cognitive biases to which experts are susceptible but which non-experts are less likely to have.

One such bias is the bandwagon effect, also called groupthink.

Groupthink is a psychological phenomenon that occurs within a group of people, in which the desire for harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome. Group members try to minimize conflict and reach a consensus decision without critical evaluation of alternative ideas or viewpoints, and by isolating themselves from outside influences.

For example, the consensus among stuttering experts is that vitamin B-1 is ineffective. This conclusion wasn’t reached by reading studies, but rather by reasoning along these lines:[ref]Nan Bernstein Ratner. 2013. RE: Vitamin B1 and stammering. May 9, 2013, ASHA SIG 04 e-mail list.[/ref]

If something like vitamins reliably helped people who stutter, why would we have not capitalized on this for over 50 years?

A few years earlier, the same expert had said:[ref]Bernstein Ratner, N. 2012. “Evidence-Based Practice and Practice-Based Evidence: Closing the Gap.” Stuttering Foundation DVD No. 6720.[/ref]

What [speech-language pathologists] do when they don’t have an answer to a clinical problem…The first thing people do is not head to the literature…[instead]…People call friends, and they look at old textbooks, which were out of date the first day that they came out, and basically they don’t use the literature. And in many respects it’s also because they’re not sure how to get to the good literature…. In our field and other fields we have an “evidence chasm.” There’s evidence but it’s not clear how it’s getting out to people.

Psychologist Gordon Rugg believes that medical fields typically concentrate research in a few areas while ignoring others, and that the ignored areas are where advances are most likely to occur, often with minimal effort or expense. His verifier method identifies areas where such “low-hanging fruit” is ready to be plucked.

Other cognitive biases common among experts include:[ref]Cognitive bias, Wikipedia, http://en.wikipedia.org/wiki/Cognitive_bias, accessed 2013 May 10.[/ref]

  • Confirmation bias “is the tendency to search for or interpret information in a way that confirms one’s preconceptions. In addition, individuals may discredit information that does not support their views.” I’ve seen this cognitive bias in almost every Ph.D. stuttering expert I’ve worked with, yet rarely seen it in school speech-language pathologists or other non-expert clinicians. The attitude of school speech-language pathologists is usually, “Tell me what I can do to help this child.” The attitude of Ph.D. stuttering experts is, “I know everything and everything I know is right. I don’t have to listen to you; you have to listen to me.”
  • Belief bias is the tendency to evaluate the validity of a conclusion by whether it is consistent with one’s prior beliefs, accepting or rejecting it accordingly. For example, I read a study in a speech-language pathology journal in which the graphs appeared to contradict the article’s written conclusions. I asked the primary author for the data and ran simple statistics, which showed that the researchers’ conclusions were 180 degrees wrong. (The original researchers hadn’t analyzed their data statistically.) The researchers weren’t interested in this. I wrote to the journal, as did several other people who had read the article and seen the same problems I’d seen. The journal editor refused to look at the statistics or publish a correction, saying that the study had been peer-reviewed and so must be right.
  • Framing is defining a problem in terms of the knowledge or tools one has. For example, I received a call from a young man who stuttered severely. He’d gone through a well-known two-week intensive stuttering therapy program, and been unable to produce any fluent speech in the speech clinic. The clinic let him go through the program a second time, for free. He still couldn’t produce a single fluent syllable. The clinic director then opened a closet and took out a delayed auditory feedback (DAF) device “to test for neurogenic damage.” The young man put on the headphones and, for the first time in his life, talked fluently (proving that he wasn’t a neurogenic stutterer). The young man asked where he could buy a DAF device. The clinic director refused to tell him, and insisted that his program was the best treatment for stuttering, and that the young man could go through the program a third time, for free. The young man said, “No, thanks,” found me, and bought a DAF device.
  • Self-serving bias “is the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests.” In other words, experts think they are more powerful or capable than they actually are. For example, a stuttering expert might say that the way he does indirect therapy or stuttering modification therapy works, and that if studies found these treatments ineffective, the studies must have involved lesser clinicians. It helps that 80% of preschool children who stutter spontaneously recover without therapy, so even with an ineffective treatment four out of five children recover and the expert takes the credit for the success (the arithmetic is sketched below).

Another example of self-serving bias is the often-heard line, “My stuttering therapy is 100% effective if the stutterer tries hard enough.” The Ph.D. speech-language pathologist thinks he or she is more capable than he or she actually is (“my therapy is 100% effective”), while blaming the stutterer when the therapy fails (“he didn’t do his exercises every day,” etc.).
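To make that base-rate arithmetic concrete, here is a minimal sketch in Python (a hypothetical simulation; the only number taken from the text above is the 80% spontaneous-recovery rate) of how a therapy with zero effect can still look highly effective:

```python
import random

random.seed(0)

SPONTANEOUS_RECOVERY_RATE = 0.80  # the 80% figure cited above
THERAPY_EFFECT = 0.0              # assume the therapy itself does nothing
NUM_CHILDREN = 1000               # hypothetical caseload

# Each child recovers with the spontaneous-recovery probability,
# regardless of the (useless) therapy.
recovered = sum(
    1
    for _ in range(NUM_CHILDREN)
    if random.random() < SPONTANEOUS_RECOVERY_RATE + THERAPY_EFFECT
)

print(f"Apparent success rate of a useless therapy: {recovered / NUM_CHILDREN:.0%}")
# Prints roughly 80%, which a clinician can mistake for evidence that
# the therapy works.
```

The point of the sketch is simply that an 80% “success rate” says nothing about the therapy unless it is compared against the 80% of children who would have recovered anyway.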

In another example of cognitive bias, when I was 19 I did two weeks of fluency shaping therapy, and then another week when I was 22. I learned to talk fluently in the speech clinic. I loved the relaxed, confident-sounding, fluent speech, but I couldn’t transfer it to conversations outside the speech clinic. When I was 28 I started speech therapy with a brand-new Ph.D. speech-language pathologist who’d written her dissertation on stuttering. Twice a week for an hour she told me that as a severe adult stutterer I would never be able to talk fluently, and that I had to change my career plans and goals in life so that I wouldn’t have to talk. I kept saying, “But at another speech clinic they taught me to talk! I want to work on that!” To me, fluency shaping proved that I could talk fluently; I just needed more than three weeks of therapy. To her, my inability to talk fluently proved that fluency shaping was smoke and mirrors. We each interpreted my earlier experiences according to our biases.

OK, my headline isn’t quite right. Having a Ph.D. makes people biased, not stupid. It’s only to the rest of us that they look stupid. Experts tend to join groups of other experts and mold their beliefs to fit the group; they tend to know something about a subject and then look for information that supports their existing knowledge while rejecting information that challenges it; and they tend to think that they are more capable or effective than they actually are.

Non-experts, in contrast, tend to think for themselves, look at new information objectively, and not imagine themselves capable of doing things they can’t do. On the runway in Tenerife, the pilot and co-pilot agreed with each other, ignored a radio conversation between the Pan Am pilot and the control tower, and believed they could take off safely despite being unable to see into the fog. The flight engineer thought for himself, paid attention to the radio exchange, and was apprehensive about taking off in heavy fog.

Cognitive biases explain in general terms why stuttering experts are often wrong, but they don’t explain specifically why the consensus among stuttering experts is to recommend indirect therapy for preschool children and stuttering modification therapy for older children and adults, both of which have been proven ineffective, while disparaging evidence-based, effective treatments. For an explanation of that, see my blog post Stuttering Experts Are the Used Car Salesmen of Speech Pathology.