Michael Crichton coined the term "Gell-Mann amnesia effect" to describe forgetting how unreliable a source is in one area when you trust it in another area. In Crichton's words:
Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray [Gell-Mann]’s case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the “wet streets cause rain” stories. Paper’s full of them.
In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.
I think about the Gell-Mann Amnesia effect when I read news stories that totally botch science or statistics. Most of the time when I read a news story that touches on something I happen to know about, it’s at best misleading and at worst just plain wrong.
Yesterday I had the opposite experience. I was trying out a new podcast, not one focused on science or statistics, that was mostly correct when it touched on statistical matters that I've looked into. They didn't bat a thousand, but they did better than popular news sites. That increased my estimate of how likely the podcast is to be accurate about other matters.
By the way, why is the effect named after the Nobel Prize-winning physicist Murray Gell-Mann? Crichton explained:
I refer to it by this name because I once discussed it with Murray Gell-Mann, and by dropping a famous name I imply greater importance to myself, and to the effect, than it would otherwise have.
6 thoughts on “Gell-Mann amnesia and its opposite”
Fortunately, John, in this blog you talk about many things I have no knowledge of. :)
I’ve had the reverse happen as well. I’ve always been a fan of Consumer Reports, primarily their user survey-based reliability data, which can sometimes uncannily predict failure areas in cars, etc. I occasionally drift towards giving their editorial review content too much weight, and this tendency is corrected when I read one of their reviews in a content area I’m more of an expert on.
There are several variants of this phenomenon: a great scientist pronounces on political issues, so he must be correct, right? If he can derive the general theory of relativity, surely analyzing society will be a doddle. If he can work out the nature of molecular bonding, international affairs will be a snap. And yet Einstein's support for socialism, and Linus Pauling's assurances about the peace-loving nature of the USSR, were idiotic.
So the journalist who doesn't know a mean from a median may yet be a good reporter of some factual event. Or he may not, if said factual event triggers his political feelings. But in the latter case the problem won't be technical ignorance but the inability to rise above his biases.
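The mean-vs-median distinction the commenter mentions is the kind of thing that trips up reporting on skewed quantities like incomes. A minimal sketch of why it matters (the income figures here are made up purely for illustration):

```python
import statistics

# A skewed sample: one large value pulls the mean far above the median.
incomes = [30_000, 32_000, 35_000, 38_000, 40_000, 45_000, 1_000_000]

mean = statistics.mean(incomes)      # dragged upward by the outlier
median = statistics.median(incomes)  # unaffected by the outlier's size

print(f"mean:   {mean:,.0f}")
print(f"median: {median:,.0f}")
```

A story reporting the "average" income of this group could honestly cite either number, yet the two figures differ by more than a factor of four, which is exactly the sort of gap a careless summary glosses over.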
Snopes is like this. I pretty much trust them on non-political matters of fact, say, about the Bermuda Triangle. But where political events are concerned, they do not lie outright, but invariably bend the stick as far as possible to give the events a leftist interpretation. Their choice of words could provide material for a course in writing political propaganda. Like Macbeth's witches, they can lie by telling the truth.
Authors and playwrights use this effect. David Mamet discussed it in one of his essays. Working in some accurate technical description/detail early in the work helps establish the story teller’s credence. Mamet often does this with his characters explaining how a con or sales system works. I remember Lanford Wilson doing this in The Fifth of July when he explained how roses are propagated from seed. I’ve seen this in all sorts of fact and fiction.
It's a great technique. It works. Still, just because someone knows X doesn't mean they know Y. In the real world, we hire specialists who are very good at X. Odds are that people with mathematical problems hire you and your firm to solve them, but they have the good sense to call a plumber or visit a dentist when a different set of skills is needed. (No offense here if you actually are a consummate plumber and a magician with bridgework. Just choose other examples.)
I suppose this effect is the handmaiden to celebrity influence peddling. What does an actor know about global warming or CO2? Why should popularity lend any authority or expertise on a subject they may know nothing about?