
Michael Crichton calls these "wet streets cause rain" stories in his piece "Why Speculate." http://larvatus.com/michael-crichton-why-speculate/

From his article:

Media carries with it a credibility that is totally undeserved. You have all experienced this, in what I call the Murray Gell-Mann Amnesia effect. (I refer to it by this name because I once discussed it with Murray Gell-Mann, and by dropping a famous name I imply greater importance to myself, and to the effect, than it would otherwise have.)

Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray’s case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the “wet streets cause rain” stories. Paper’s full of them.

In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.




I remember when I started noticing that, on the occasions a newspaper wrote about something I knew well, they usually got some vital detail wrong. That really undermined my trust in that newspaper. I've got a better paper now, but that first one is generally regarded as a pretty decent one. Still, if they consistently get the details wrong on subjects I know, how far can I trust their articles on subjects I don't?

Of course it's also possible I only remember the times they get it wrong, and I don't notice the many times they get it right when writing about something I know something about, leading me to underestimate the paper's accuracy.

I'm sure somebody somewhere is willing to make up a name for that effect too.


Perhaps Gell-Mann moved from a paper that was bad at writing about physics but good at writing about Palestine, to a paper that was good at writing about physics but bad at writing about Palestine. This kind of transition seems like it might help maintain a filter bubble.


This is certainly a risk, but unfortunately we can extend the argument to suggest that you can't ever trust any source. Say Gell-Mann wants to find a paper he can trust on Palestine. He's good at physics, and checks reporters covering both topics.

A reporter might be good or bad at covering each topic, which leaves us four cases to consider. If they're bad at physics, they could be a specialist in Palestine, but they could also be generally incompetent; Gell-Mann has no way to tell. If they're good at physics, they could just be talented reporters, but they could also be physics specialists who don't know Palestine any better than Gell-Mann does. And once again, he has no way to tell.

Even probabilistically, we could argue for either approach. Most people aren't experts in more than one thing, and it's easier for an expert than a random fool to garner attention for a baseless claim, so perhaps we should especially distrust good physics writers on Palestine. But incompetence is broadly correlated, and journalism skills apply to both topics, so perhaps we should view bad physics writing as a sign of weak fundamentals and distrust it everywhere.

(I'm talking about reporters instead of papers, but we can push the argument back a level easily; editors have to hire reporters for fields they don't know.)

Going it alone, all I can see to do is look for people who claim to specialize in a few things, one of which you know well, and trust them on their other specialties. As a society, we can perhaps do better by asking a bunch of experts who is competent in their domain and looking for agreement - provided we can all correctly agree on some experts in advance.

edit: a bit of searching suggests this is basically Berkson's Paradox. If we (boldly) assume that news sources which are bad on all topics don't circulate, then quality in one area lowers the expectation of quality in other areas.
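
A toy simulation (made-up numbers, just to show the shape of that selection effect) makes it concrete:

    import random

    random.seed(0)

    # Give each source an independent quality score on two topics.
    sources = [(random.random(), random.random()) for _ in range(100_000)]

    def corr(pairs):
        n = len(pairs)
        mx = sum(x for x, _ in pairs) / n
        my = sum(y for _, y in pairs) / n
        cov = sum((x - mx) * (y - my) for x, y in pairs) / n
        vx = sum((x - mx) ** 2 for x, _ in pairs) / n
        vy = sum((y - my) ** 2 for _, y in pairs) / n
        return cov / (vx * vy) ** 0.5

    # Unfiltered, the two qualities are independent: correlation near 0.
    print(corr(sources))

    # Now assume sources that are bad at everything don't circulate:
    # keep only those whose combined quality clears a bar.
    surviving = [(x, y) for x, y in sources if x + y > 1.0]

    # Among the survivors, quality in one area predicts *lower* quality
    # in the other: the correlation comes out clearly negative.
    print(corr(surviving))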


Although there is another interpretation (especially in traditional newspapers): you aren’t just evaluating the individual journalist, you’re evaluating the editorial staff. They are responsible for finding a physics expert to write about physics and a Palestine expert to write about Palestine.

If they do a poor job of selecting a physics expert, then it seems likely that they will do a poor job of selecting other kinds of experts as well.


Hm, lots of options in play: negativity bias (bad stuff is more memorable), conservatism bias (low-frequency events are overestimated), the availability heuristic (memorable events, like the misrepresentations, get overweighted).

Honestly though, I'm not sure I'd call it a bias in your observation so much as a sensible assessment of sources. A source that's right about 90% of topics is still misleading you quite often. Worse, a source that's 90% accurate on each individual detail can leave you almost totally ignorant: there are a lot of stories whose entire message can be destroyed by any one of numerous errors.

(And of course, there's a pithy name for that too. "O-ring theory", after the Challenger shuttle disaster, describes phenomena where everything has to go right, so the success probabilities at each step multiply together and a failure anywhere sinks the whole thing.)
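
To put rough numbers on the multiplicative point (the figures below are invented, just to show how fast it bites):

    # A story resting on n independent details, each reported correctly
    # with probability p, comes through intact with probability p**n.
    for p in (0.90, 0.95, 0.99):
        for n in (5, 10, 20):
            print(f"p={p:.2f}, n={n:2d}: intact {p**n:.0%} of the time")

Even at 99% per-detail accuracy, a 20-detail story only survives intact about 80% of the time; at 90% it's roughly one story in eight.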


I think that's just confirmation bias working in tandem with the amnesia effect described, but it's certainly a valid point. To me, the hardest part of this problem is that trust tends to be stored (at least in my brain) as a binary yes or no, though possibly with a fuzzy border and some room in the middle. It's tough for me to read something and adjust my trust in the source by a proportional amount for getting "right" the things I "know", when even what I know is uncertain and I could be wrong.

The end effect is that I tend to double-check just about any source, but I still have a handful of sources I have strong reason to believe are compromised on various topics where I expect their motives to conflict with the motive to tell the truth. It's still not great, though, because sources rarely move up in this system and frequently move down, which leads to a general inability to find information I consider trustworthy. This works fine on things where there is a general consensus, because I'll be able to find a varied set of sources saying the same thing, but not so well on hotly debated topics with lots of nuance. Also, sometimes my brain is lazy and I don't do any of this processing, because I'm an imperfect human.
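
For what it's worth, the proportional version is easy to write down even if it's hard to run in one's head. A minimal sketch, with invented probabilities, treating trust as a number nudged by each claim you manage to check:

    def update_trust(prior, checked_out, p_if_good=0.9, p_if_bad=0.4):
        # Bayes' rule on "this source is reliable", given one claim I was
        # able to verify. p_if_good / p_if_bad are invented: the chance a
        # reliable / unreliable source gets a checkable claim right.
        like_good = p_if_good if checked_out else 1 - p_if_good
        like_bad = p_if_bad if checked_out else 1 - p_if_bad
        return prior * like_good / (prior * like_good + (1 - prior) * like_bad)

    trust = 0.5                                # start agnostic
    for claim_checked_out in [True, True, False, True]:
        trust = update_trust(trust, claim_checked_out)
        print(round(trust, 3))                 # 0.692, 0.835, 0.458, 0.655

One wrong claim pulls trust down harder than one correct claim pushes it up, which at least matches the intuition that sources move down faster than they move up.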



