Consensus Cascade: Fear the “Experts”
The New York Times piece “Diet and Fat: A Severe Case of Mistaken Consensus” explains how a scientific “consensus” formed around the idea that a low-fat diet was best, despite evidence to the contrary (via Dale Light). How’d it happen?
We like to think that people improve their judgment by putting their minds together, and sometimes they do. The studio audience at “Who Wants to Be a Millionaire” usually votes for the right answer. But suppose, instead of the audience members voting silently in unison, they voted out loud one after another. And suppose the first person gets it wrong.
If the second person isn’t sure of the answer, he’s liable to go along with the first person’s guess. By then, even if the third person suspects another answer is right, she’s more liable to go along just because she assumes the first two together know more than she does. Thus begins an “informational cascade” as one person after another assumes that the rest can’t all be wrong.
Because of this effect, groups are surprisingly prone to reach mistaken conclusions even when most of the people started out knowing better, according to the economists Sushil Bikhchandani, David Hirshleifer and Ivo Welch. If, say, 60 percent of a group’s members have been given information pointing them to the right answer (while the rest have information pointing to the wrong answer), there is still about a one-in-three chance that the group will cascade to a mistaken consensus.
Cascades are especially common in medicine as doctors take their cues from others, leading them to overdiagnose some faddish ailments (called bandwagon diseases) and overprescribe certain treatments (like the tonsillectomies once popular for children). Unable to keep up with the volume of research, doctors look for guidance from an expert — or at least someone who sounds confident.
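That one-in-three figure is easy to sanity-check. Below is a minimal Python sketch of the kind of model Bikhchandani, Hirshleifer, and Welch analyze: the true answer is fixed, each voter gets a private signal that is right 60 percent of the time, votes aloud in sequence, and rationally stops trusting their own signal once the public record tilts far enough against it. The function name, the coin-flip rule for indifferent voters, and the 100-person, 100,000-trial setup are my own illustrative choices, not anything from the article.

```python
import random

def run_cascade(n_people=100, p_correct_signal=0.6, rng=random):
    """Simulate one round of sequential public voting, BHW-style.

    The true answer is 1. Each person privately draws a signal that
    matches the truth with probability p_correct_signal, answers aloud
    in turn, and sees every earlier public answer.
    """
    cascade_on = None    # None until the group locks in, then 0 or 1
    last_choice = None   # most recent public answer since the last "reset"
    for _ in range(n_people):
        signal = 1 if rng.random() < p_correct_signal else 0
        if cascade_on is not None:
            choice = cascade_on          # cascade: ignore own signal, copy the herd
        elif last_choice is None:
            choice = signal              # no usable history: trust own signal
        elif signal == last_choice:
            choice = signal              # own signal agrees with the last voice:
            cascade_on = choice          # two in a row, everyone after this copies
        else:
            # Own signal exactly cancels the last voice: indifferent, flip a coin.
            choice = rng.randint(0, 1)
            if choice == last_choice:
                cascade_on = choice      # the record looks unanimous, cascade anyway
            else:
                last_choice = None       # record looks balanced again: reset
                continue
        last_choice = choice
    return cascade_on

# Monte Carlo estimate of how often the group locks onto the WRONG answer.
trials = 100_000
wrong = sum(run_cascade() == 0 for _ in range(trials))
print(f"wrong-consensus rate: {wrong / trials:.1%}")  # ~37%, about one in three
```

Even though a clear majority of private signals point the right way, the group talks itself into the wrong answer a bit more than a third of the time.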
Hm. Sound familiar? As Dale Light notes:
We should remember that “science” is conducted by human beings with all the weakness and fallibility that entails, and that credentialed “experts” are often disastrously wrong. With that in mind we should recognize that expert opinion is a weak and shifting base on which to construct public policy.
While this is true in the hard sciences, it is especially true in the social sciences, which are less empirical no matter what anyone says. Usually the prescription written to solve a societal ill is more test run than cure. The reality is that it often takes years, decades, or even centuries for good ideas to percolate and solidify into something that works.
Quick examples:
Eggs
ADD/ADHD
Bipolar disorder
Margarine vs. butter
High-fructose corn syrup
Hydrogenated oils
Global warming