Scott Alexander wrote yet more words defending his decision to write two posts totaling 25,000 words about Ivermectin. Then he wrote a second post trying again.
More centrally, his first post, of which I very much approve, is defending the most important idea of all: Think for yourself, shmuck!
I want to make clear my violent agreement with all of the following.
Think for yourself, shmuck!
When it seems worthwhile, do your own research.
The ones telling you not to ‘do your own research’ are probably the baddies.
Also applies to the ones telling you to ‘trust us and Trust the Science™’ and calling you an idiot or racist or calling for you to be censored if you disagree.
Baddies or not, those people are still more likely to be right about any given point than those saying they are wrong or lying to you, unless you have seen them lying or being wrong a lot about closely related things previously. And your own research will often not seem worthwhile once you consider opportunity costs.
When people draw wrong conclusions like Ivermectin being effective or that Atlantis used to exist or whatever, telling people that they are idiots or racists for drawing that conclusion is not going to be super effective.
Pointing out ‘the other side are conspiracy theorists’ or ‘the people who believe this also believe these other terrible things’ does not prove the other side is wrong, nor is it going to convince anyone on the other side that they are wrong.
If you instead explain and work through the evidence, there is a chance someone might be convinced, that is God’s work, you are providing a public service.
There are not ‘legitimate’ and ‘illegitimate’ places to Do Science. You can virtuously Do Science to It, for all values of It and of You.
No, we cannot assume that the medical establishment, or any other establishment, will always get such questions right. That is not how any of this works. Even the best possible version of the medical (or other) establishment will sometimes get it wrong. If no one can point this out without being dismissed as a conspiracy theorist or racist, then the establishment will keep getting it wrong, and so will you. Criticism is the only known antidote to error in such situations.
I would also add, from Kavanagh’s response to Scott in a comment, my disagreement with this particular thing, regarding scuba diving to purported Atlantean ruins:
I also don't think I would have the same intuition you have that personally exploring the ruins would be informative. I think that would actually be likely to skew my perspective as it feels like it would deliver potentially inaccurate intuitions and that it would require already having the expertise to properly assess what you are seeing.
Actually getting the skills, running experiments, seeing the evidence for yourself? That’s all great stuff in my book. It’s not cheap to do, but if you care enough to learn to scuba dive, by all means scuba dive and see the primary evidence with your own eyes. It seems crazy to me to think this would not be a helpful thing to do - to me it is the most virtuous thing to do, if you care a lot.
Alas, Scott then backtracks a bunch in this second post.
He is afraid others will see him saying not to ‘trust the experts’ so he wants to reiterate to trust the experts, that reasoning is hard and you probably shouldn’t try doing it yourself. Then he says this:
To a first approximation, trust experts over your own judgment. If people are trying to confuse you about who the experts are, then to a second approximation trust prestigious people and big institutions, including professors at top colleges, journalists at major newspapers, professional groups with names like the American ______ Association, and the government.
If none of this rings true, figure out whether you really need to have an opinion.
To a first approximation, you should never suspend the first approximation.
At its best this behavior is free riding. It will not often be at its best.
That whole speech, to me, is a Lovecraftian horror. If we tell young people to (almost) always trust the American [X] Association on X, and journalists about the news, dismiss anything that the authorities call a conspiracy theory, and never get any practice thinking for themselves on such matters, we deserve what we get.
I love that this is the top comment on the post, note inside the parenthesis:
Another objection I don’t buy is the idea that if you are seen giving too much credibility to conspiracy theories, you risk making people generally more vulnerable to conspiracy theories, by increasing their priors on conspiracy theories.
I have several objections to this objection.
You’re saying we should engage in a conspiracy to discredit conspiracy theories?
It is a very bad principle to avoid providing Bayesian evidence if you think this would move someone’s posterior in the wrong direction due to other mistakes.
This is a lot like (at least self-) censoring information that goes against things you believe to be true, on the theory that it is true and therefore it would be bad if people got evidence that made them believe it less.
What do you think about a field if every time they find evidence of X they publish and shout from the rooftops, and every time they find evidence against X they put it in a drawer? What do you believe about X? Does X being true make this better?
I am not convinced that ACX readers tend to give too much credence to conspiracy theories, or put too little trust in the establishment’s claims.
I am not convinced that considering and then robustly dismissing well-known conspiracy theories will give more credence to such theories.
It lowered my credence for such theories, since I now have a clear data point on one of them. I expect many people, especially those who had previously had doubts about the particular theory in question, would react the same way.
Scott’s characterization of the three ways to view conspiracy theories - Idiocy (dumb things people fall for), Intellect (same as other theories, only worse, which is mostly the way I see them) and Infohazard (deadly traps that lie in wait) - has this same assumption that the goal is to have fewer people believe things in the category ‘conspiracy theory.’ Which is why them being low status would seem good. That… doesn’t seem like the right goal, unless you think all conspiracy-theory-shaped theories are always false?
The objection of Kavanagh that I do buy, that I think is important, is that you need to read the contextual clues in a situation to know whether sources are worth treating as credible and worthy of your time. Otherwise, you’re going to waste a lot of time on questions where you already know the answer.
Was the full analysis of Ivermectin a good use of Scott’s readers’ time? If everyone who reads Scott at all had read the whole thing, then no. If everyone made a reasonable personal decision on whether they found such an analysis of value before reading, then yes. The output was quite valuable to the right people, especially those who could be convinced. I also found it of value.
Was it a good use of Scott’s time? Less clear to me. My guess is the first analysis plausibly was, the second one probably wasn’t.
I was given the same test, here. In addition to Scott Alexander, Alexandros (who I am in no way saying is a ‘conspiracy theorist’ or any other label) talked extensively to me. His initial claims were thoughtful and interesting, and I engaged. It became clear that he was applying asymmetric standards to evidence. A bunch of his claims weren’t checking out when I looked further. It then became clear he was also bringing other Covid-authority-skeptical standard takes, in particular questioning Pfizer and the vaccine, in ways I had looked into and knew were not good mistakes to be making.
I was confident he would have continued to talk to me and raise additional questions on this, and on vaccinations, and on other things, for as long as I was willing to listen. And I was convinced that he was not about to quit, or be convinced, no matter the response.
So after spending several hours on this, I concluded continued engagement was not a good use of my time, and I stopped doing it. I think that was a good decision process.
Initially I wrote that this became ‘quickly clear,’ but when I looked back it seemed it wasn’t all that quick.
I don't really disagree but let me say this: I spent like 30 years on the theory that if folks made good, rational decisions my ideas would work and if they made bad ones it was on them. It has never, ever, produced good results for anyone. Some acknowledgement of the machinations of Moloch needs to be made for any large social problem and at this point I'd consider "people believing false things because they don't like the people who believe the true things" to be a large social problem.
One of the things that I've noticed when digging into a few conspiracy theories is that they "rhyme", semantically. They all seem to use the same tricks:
- "you can trust me, why would I lie to you?"
- "this expert was wrong in this one particular place, so nothing she says can be trusted"
- "I have a single interesting argument backed by facts, so you should believe everything else I say"
- "if you accept the conventional wisdom, you're a sheep. You don't want to be a sheep, do you?"
- "Here's a fact. Allow me to extrapolate from that fact to the nth degree"
If I were emperor, I would explore the idea of requiring students earning advanced degrees to demonstrate their critical thinking skills by debunking a conspiracy theory. Not because the theory needs debunking, necessarily, but because the process of dissecting those arguments will set that student up to recognize the "rhyme and meter" of other conspiracy theories.