Do you think scientists are better than most people at critical thinking? Think again.

Reijo Salonen and I had an enjoyable and lively session during our talks a couple of weeks ago in Turku. His topic, “How to Successfully Fail in Drug Development”, sparked a lot of interest!

Drug development is very challenging, and the difficulties begin at the very start of the process. A striking finding is that more than 80% of target-identification results from academic laboratories cannot be replicated in an industry setting. Even once a molecule is ready to go into man, 50% of drug candidates fail in Phase 3 clinical studies. The external environment changes along the way as well, with competitors appearing, regulations changing, and key advocates leaving the company.

What can be done about it? Reijo discussed several options for improving the odds, including using human in vitro data as much as possible, using adaptive clinical trial designs, getting an outside review of your projects, and communicating with stakeholders every step of the way.

But why do these failures happen in the first place? We discussed some possible reasons during my talk, “Don’t Confuse me with Logic: Critical Thinking on the Path from Innovation to Commercialization”.

Mental shortcuts lead to errors

One way to define critical thinking is the avoidance of bias. You might think that scientists are better than most people at critical thinking, but this is not necessarily true. In everyday life, we use mental shortcuts (or heuristics) to make quick decisions in the many situations we encounter. This is an innate process essential to our survival. And most of the time, it’s just fine.

However, these shortcuts can lead to errors in our thinking, and when the errors happen systematically, we become biased. Heuristics lay many traps for us, but with awareness, and by slowing down your thinking, you can apply logic and rationality and make better decisions. The heuristic pitfalls we considered were:

  • Availability (relying only on information I already know),
  • Affect (going with my gut feeling),
  • Anchoring (sticking with my first guess, without knowing its accuracy), and
  • Representativeness (the Law of Small Numbers).

This is how they work

It’s easy to see that publication bias and confirmation bias arise from the Availability Heuristic: most published results are positive and confirmatory. If you’ve ever had trouble giving up a poorly planned home improvement project, your Affect Heuristic is at work in the form of a ‘sunk cost’ bias. And when you assign a “prior” in a Bayesian analysis, be alert to the Anchoring Heuristic: if that initial estimate is badly off, the experiment built on it may fail.
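To see how a misplaced anchor can propagate, here is a minimal sketch in Python with made-up numbers (the beta-binomial model is my illustrative choice, not something from either talk): the stronger a wrong prior, the more the final estimate stays stuck at the anchor instead of following the data.

    # Illustrative only: how an overconfident, wrongly anchored prior distorts
    # a Bayesian estimate of a response rate. All numbers are invented.

    def posterior_mean(prior_successes, prior_failures, successes, failures):
        """Posterior mean of a beta-binomial model: Beta(a, b) prior plus binomial data."""
        a = prior_successes + successes
        b = prior_failures + failures
        return a / (a + b)

    # Observed study: 12 responders out of 60 subjects (observed rate 0.20).
    successes, failures = 12, 48

    # Weak prior anchored at 0.60 (worth 10 imaginary subjects): the data dominate.
    print(posterior_mean(6, 4, successes, failures))     # ~0.26

    # Strong prior anchored at 0.60 (worth 200 imaginary subjects): the anchor dominates.
    print(posterior_mean(120, 80, successes, failures))  # ~0.51

The contrast between the two lines is the point: the more confidently the anchor is held, the more evidence it takes to move away from it.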

For Representativeness, we discussed Daniel Kahneman’s concept of the “Law of Small Numbers”. This “law” refers to the incorrect belief, held by experts and laypeople alike, that small samples ought to resemble the population from which they are drawn. That expectation is justified for large samples, but NOT for small ones, yet people routinely rely on it when making decisions.
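A quick simulation makes the “law” concrete (a sketch with invented numbers, not data from the talks): even when the true rate is exactly 50%, small samples miss it badly far more often than large ones.

    # Sketch: how often a sample misses the true population rate by more than 0.2.
    import random

    random.seed(1)

    def share_far_from_truth(n, true_rate=0.5, tolerance=0.2, trials=10_000):
        """Fraction of simulated studies of size n whose observed rate
        misses the true rate by more than the tolerance."""
        misses = 0
        for _ in range(trials):
            observed = sum(random.random() < true_rate for _ in range(n)) / n
            if abs(observed - true_rate) > tolerance:
                misses += 1
        return misses / trials

    print(share_far_from_truth(n=5))    # roughly 0.37: small samples stray often
    print(share_far_from_truth(n=500))  # essentially 0: large samples do not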

If you describe a friend who is quiet, shy, keeps to himself, and likes order and process, then ask a colleague to guess whether he is more likely a librarian or a farmer, most people guess librarian. However, there are 20 times more male farmers than male librarians, so that guess is probably wrong. Similarly, when a significant effect is observed in an experiment with a small number of subjects, we are quick to generalize it far beyond what the data can support.
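You can put rough numbers on that intuition. Only the 20-to-1 base rate comes from the example; the 4-to-1 likelihood figure below is my own assumption for illustration.

    # Bayes' rule in odds form, applied to the librarian/farmer guess.
    # The likelihood ratio of 4 is an assumed, illustrative figure.

    prior_odds = 1 / 20       # male librarians to male farmers (from the text)
    likelihood_ratio = 4.0    # assume the description fits a librarian 4x better

    posterior_odds = prior_odds * likelihood_ratio
    p_librarian = posterior_odds / (1 + posterior_odds)

    print(round(p_librarian, 2))  # 0.17 -- even so, "farmer" is ~5x more likely

Even a description that strongly suggests a librarian cannot overcome a base rate that lopsided, which is exactly what the Representativeness Heuristic makes us forget.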

All of this might seem discouraging, but take heart! Once you are alert to your biases, you can take important steps to overcome them. Reijo’s talk described some of the remedies specific to drug development. For any other endeavor, ask yourself: could I be wrong? Am I really considering another point of view? What’s missing?

And slow down your thinking to take full advantage of the logic, knowledge, training and analytical skills you’ve worked so hard to gain.

Ellen Strahlman, InFLAMES Visiting Professor