
"The other option is division of labor and outsourcing.

If you can find a sufficiently trustworthy secondary source that analyzes the information for you, then you don’t need to worry about the trust level of their sources. That’s their problem."

I don't think you intended it to, but this finally convinced me to upgrade to a paid subscription. When worded this way, it's clearly too valuable a service to expect someone to do it for free.


"This seems very much like a Be The Change You Want to See in the World situation."

For more on this, please see my LessWrong posts on how to strengthen one's virtues of Honesty and Sincerity: https://www.lesswrong.com/s/xqgwpmwDYsn8osoje/p/9iMMNtz3nNJ8idduF and https://www.lesswrong.com/s/xqgwpmwDYsn8osoje/p/haikNyAWze9SdBpb6 respectively.


It caught my attention to hear Bloomberg called out as reliable. I don't ever really pay attention to them, so the main thing that comes to mind when I think of them is the "Big Hack" story, which (as I understand it) most likely turned out to be false, but as an honest failure of the kind that will occasionally happen with investigative reporting, as opposed to an intentional narrative-fitting deceit.

For the most part, the fact that I don't really think about them at all or have an opinion on them suggests that they haven't been feeding the zeitgeist of anger, so that's a point in their favor. I may try checking them out periodically.


This article and Scott's are going to be required reading for my scouts working on the Citizenship in the Nation merit badge, which has a *very* outmoded media literacy component. Fantastic read.

Also, I think you should add Scientism to your lexicon. It's an idea you use often.


I’ve been following economist Paul Krugman for 20+ years, and have found he has an excellent track record for predictions…even for things that are out of his lane, like the misinformation used to justify the Iraq war.

He explains his reasoning, shows his data, and admits when he was wrong (like when he predicted the Internet would not be important).

Yeah yeah…he’s best known as a columnist for the dreaded NYT, but he’s also on Twitter, if anyone wants to give him a fair shake.


This is a legendary post, and not because if it was merely good the living would envy the dead. If you have ever thought that you would like to write a book but didn't know what would be a good topic, I would highly recommend this. I honestly don't think people understand these sorts of issues well anymore, not the way they used to, and this was masterfully done. Thank you.


> He doesn’t draw any parallels to the past, but his version of bounded distrust reads like something one might plausibly believe in 2015, and which I believe was largely the case in 1995. I am confused about how old the old rules are, and which ones would still have held mostly true in (for example) 1895 or in Ancient Rome.

I'm somewhat worried about Gell-Mann amnesia here. We see people lying about things we know about (the events of the recent past), recognize it for nonsense, then blithely assume that other people in a substantially similar incentive structure - and occasionally actually the same people in the 2015/1995 cases - weren't doing the same thing for the same reasons. I don't know for sure this *isn't* true, but I'd also only be moderately surprised to learn that the "new rules" are much older than this post assumes.


This article is absolutely fascinating to me and I'm glad you wrote it.

I am a person who falls into a bunch of the "Incorrect Anti-Narrative Contrarian Cluster". I obviously believe I am correct. But reading through your essay I was starting to get some cognitive dissonance because you're making really good arguments about things that I had not considered before.

About halfway through this article, it hit me what's going on. We have nearly identical reasoning processes, but come to opposite conclusions because of different weights on different priors.

Take this for example

> So yeah, anthropogenic global warming is real and all that, again we know this for plenty of other good reasons, but the reasoning we see here about why we can believe that? No.

...

> This is not the type of statement that we can assume scientists wouldn’t systematically lie about. Or at least, it’s exactly the type of statement scientists will be rewarded rather than punished for signing, regardless of its underlying truth value.

To simplify the example, consider two priors. 1: "I am intelligent, I have checked the data for myself, and it's good." 2: "I don't know if scientists _did_ lie about this, but they totally _would_, based on their incentives and their demonstrated past behaviour."

You weight (1) stronger. I weight (2) stronger. You trust in your own ability to evaluate scientific research independently, more than I trust in my ability to do the same. I believe public officials' propensity to lie is higher than you believe it is.
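
A minimal sketch of this in code (my own toy illustration, not anything from the comment; the probabilities and weights are invented for the example). Identical machinery, different weights on the two priors, opposite conclusions:

```python
import math

def logit(p: float) -> float:
    return math.log(p / (1 - p))

def sigmoid(x: float) -> float:
    return 1 / (1 + math.exp(-x))

# The two priors, expressed as log-odds that the official claim is trustworthy.
data_check = logit(0.9)        # (1) "I checked the data myself and it's good"
incentive_to_lie = logit(0.2)  # (2) "they totally would lie, given their incentives"

def combined_belief(w1: float, w2: float) -> float:
    """Weighted combination of the two log-odds, squashed back to a probability."""
    return sigmoid(w1 * data_check + w2 * incentive_to_lie)

print(round(combined_belief(w1=1.0, w2=0.3), 2))  # weights (1) heavily -> ~0.86, trust
print(round(combined_belief(w1=0.3, w2=1.0), 2))  # weights (2) heavily -> ~0.33, distrust
```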

This is absolutely fascinating to me and, I think, contains a lot of explanatory power for how I came to be in such violent disagreement with many of you on many of these issues.


I don't see any reason for believing politicians are any more dishonest than the media in any general sense, and I don't think this 2-week time frame makes a lot of sense. Politicians' incentives are different from the media's to the extent that their constituency is different, not in some deep way. And most politicians do actually seem to be pretty careful about not straight-up lying about disprovable physical truth, in exactly the same way as the media. In fact, they are at more of a disadvantage than the media in the dishonesty game, because they actually have to live in the real world enough to understand how to win an election.

This is probably somewhat less true of non-establishment politicians, who may not know the rules, or may have constituencies that are systematically different from those of establishment politicians in ways that make lying about ground truth more worth the risk. But on the flip side, experienced politicians running for high-visibility office in swing states probably have to be *more* honest than any given news source, because they have to appeal to a broader constituency and are more open to attack if they lie.

2 weeks is arbitrary, and I think for someone who lied about ground truth in an obvious way, getting caught at any time would hurt them. Brian Williams only got demoted about 4 months after it came out that he'd been lying for years about getting shot at/shot down in Iraq. That was 2015 (old rules?), and he did get caught 4 days after his latest telling of the story, but importantly it didn't just blow over after 2 weeks.


Thank you for writing this. I found it worthwhile and insightful.

What I don't feel confident about at all is how to avoid ending up in an epistemic defensive crouch where my priors become unshakeable. I don't want to trust untrustworthy sources, sources mostly seem untrustworthy, and the ones that seem most trustworthy generally are ones that basically align with my worldview (presumably this is a common problem, given how typical my mind is). That's a great recipe for not being fooled if I'm currently not being fooled. It's a terrible recipe for escaping if I'm *currently* being fooled.

How would I even know the difference?

I've been leaning really hard on the "these people seem to be trying to help me figure out how much to trust them" angle--anybody who puts an "epistemic status" tag at the top of their thing, for instance. But other than just looking for people who end every sentence with a question mark, I'm pretty low confidence that the heuristics I've figured out aren't just me justifying my own preconceived notions.

Anyway, thanks for articulating a lot of what I've been feeling!


Really good article, and I really enjoyed it.

One thing that does end up bothering me about your and Scott's posts is they both describe an informational world that's hopelessly broken because of dishonesty in such a way that maybe at best 1% of people can, with tremendous effort, know kind-of-sort-of what's going on in a few niche subjects. And then both articles just... stop. I'm always waiting for that "and thus this is lying, and lying is bad, and we shouldn't do it, and maybe we could try to change the world in X way" practical component, but it never really comes.

In rationalist circles, that practical component is basically never proposed, and I'm not sure why. "Money can end more or less suffering" leads to EA. "Neurons are consciousness" leads to Scott recommending only eating the very biggest animals, or something. "Lying has no significant downsides and has made all our news and the entirety of science unreliable except for like 3 guys" leads to... nothing, ever; it's treated as an assumed natural law that doesn't respond to norms and shouldn't be challenged.

It's always been weird to me, because anything even kind of rationalist-adjacent absolutely needs data to function at all; bad data breaks the whole system. This is sort of the biggest problem for knowing anything or doing anything effectively that we could imagine, and the furthest we are generally willing to go is describing the problem, describing the incentives that caused it, and completely refusing to imagine the kind of incentives that might un-cause it.


My heuristic is a bit simpler: if any source that is a professional user of words uses hedges (could, would, should, etc.) outside of a context that demands it (explaining uncertainty), they are almost certainly lying and you should dig deeper.

The other major tell is mixing units in the same story (totals vs. percentages) or otherwise obfuscating actual data points (a narrative about how good or bad a particular metric is, without trends over time). A toy sketch of that first heuristic is below.
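
This is my own illustration, not anything the commenter proposed; the hedge list, the "uncertainty context" markers, and the sample text are made-up stand-ins for human judgment:

```python
import re

# Hedge words that should raise suspicion when no uncertainty is being
# explained; both word lists here are crude, invented stand-ins.
HEDGES = {"could", "would", "should", "might", "may", "reportedly"}
UNCERTAINTY_MARKERS = {"estimate", "confidence", "margin", "probability", "uncertain"}

def flag_hedged_sentences(text: str) -> list[str]:
    """Return sentences that hedge without any sign the hedge is earned."""
    flagged = []
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        words = {w.lower().strip(".,;:!?") for w in sentence.split()}
        if words & HEDGES and not words & UNCERTAINTY_MARKERS:
            flagged.append(sentence)
    return flagged

print(flag_hedged_sentences(
    "The policy could cost thousands of jobs. "
    "Our estimate has a wide margin and may be off."
))  # -> ['The policy could cost thousands of jobs.']
```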


2020 broke "don't pay attention to the news for me."

Obviously, COVID restrictions dramatically affected real life in a way that politics never did before. You just couldn't ignore it. COVID basically radicalized me.

Also, the ongoing race and gender Cultural Revolution. A little bit of cynical affirmative action is dumb and unjust, but not the end of the world. But it seems like "cynical apathy" just wasn't a stable equilibrium. I've come around to the idea that we have to "Face Reality," as Charles Murray says. People who can't process the relevant facts have attempted to build an understanding of how things are while ignoring important facts, and they've built crazy world models that demand they force crazy things on the rest of us.

It may be harder for people that don't have young children to understand this, because both of these items are really really bad in the schools right now compared to regular adult life.

Anyway, it's not clear what "obviously disprovable factual statements about the world" means. If something involves multiple regression and cause-and-effect statements, even if the evidence is really strong in one direction, it's harder to "definitely" prove than "X% of Y at time Z." COVID and race/gender claims are often in that category. It's easy enough to provide a level of evidence in support of a claim that a reasonable person would accept, but not enough to 100% shut down someone engaged in motivated reasoning.


> But what about the global problem as a global problem? Sure, politicians have mostly always lied their pants on fire, but what to collectively do about this epic burning of the more general epistemic commons?

There's also the problem of AI, of course. It seems rather inevitable that in a few years we're screwed if something is not done, even if further advances in the field stall for some reason.

This is a nice take on a problem, if a little dated (ok, it's a few months old, but this timespan is subjectively years now...): https://youtu.be/oppj9MdNf44
