One of the things that I've noticed when digging into a few conspiracy theories is that they "rhyme", semantically. They all seem to use the same tricks:
- "you can trust me, why would I lie to you?"
- "this expert was wrong in this one particular place, so nothing she says can be trusted"
- "I have a single interesting argument backed by facts, so you should believe everything else I say"
- "if you accept the conventional wisdom, you're a sheep. You don't want to be a sheep, do you?"
- "Here's a fact. Allow me to extrapolate from that fact to the nth degree"
If I were emperor, I would explore the idea of requiring students earning advanced degrees to demonstrate their critical thinking skills by debunking a conspiracy theory. Not because the theory necessarily needs debunking, but because the process of dissecting those arguments will set the student up to recognize the "rhyme and meter" of other conspiracy theories.
It's good to draw a distinction between the *case for* a conspiracy theory, the way it is presented, and the theory itself. Certainly when you see someone engaging in the CS-grift-hype-snow cycle, or whatever you would best call it - the thing you call rhyme and meter - you should downgrade the source accordingly.
To me, this sounds a lot like what most people say regarding more standard theories, to the point that a regular layperson may have trouble distinguishing between the Standard theory and the Conspiracy theory. Even big name experts often say similar things, like change your second line to "this other expert..." and the fourth to "if you accept [other theory] you're a rube..."
"Expert" is not a real category in many disputes. It's also extremely fuzzy when multiple experts disagree with each other, as is very often the case. How do you say that ivermectin doesn't work when hundreds (thousands?) of doctors and scientists are trying it with active patients and several countries have added it to their official treatment advice? Personally, I say that you *don't take a stance* on that question because it's still open. Even if you're a doctor working actively on testing it, you say something like "early results indicate..." Right now, my feelings on ivermectin are "pretty unlikely to work, maybe for certain people in certain situations that we don't fully understand." There's not much harm to be found there, either way. If I were prescribed it by my doctor, I would be willing to try it (as with most things he might prescribe to treat what I went to him to fix), but wouldn't pressure him to prescribe it.
Trying to do otherwise becomes an exercise in identifying who the "real" experts are, which unsurprisingly aligns almost perfectly with which experts say the things we already think are true.
It feels like this post might not have sufficiently digested David Deutsch's The Beginning of Infinity, or at least its Popperian sections... It's as if there is an assumption that we can do better than dialogue and critique. Deutsch would disagree, and it's hard to argue with him once the case is laid out.
I have a friend who holds several conspiracy theories, and he always asks for certainty from the positions that oppose him, and believes he has exposed them when they can't provide it. So he believes the moon landings were faked, and the mere fact that the scenes could have been filmed in a studio seems to him proof that they were. Isn't the beginning of rationality an agreement to accept the same standards of evidence on both sides of an argument?
It would be good to lay out which angle or paragraph of the post offends Deutsch, the nature of the offense, and the proposed remedy. What is beyond dialogue and critique that is embraced, incorrectly, here?
> Isn't the beginning of rationality an agreement to accept the same standards of evidence on both sides of an argument?
Who are you questioning here? This post argues for symmetric standards of evidence:
> I was given the same test, here. In addition to Scott Alexander, Alexandros talked extensively to me. His initial claims were thoughtful and interesting, and I engaged. It became quickly clear that he was applying asymmetric standards to evidence.
I did not read Deutsch. I could be persuaded to with a high enough combined bid from people I trust, seems like a reasonable thing to do.
Perhaps this is, as came up in another comment, the difference between a theory and the typical method of arguing for that theory? If someone says 'if you don't prove X then that proves ~X' or things like that, it is very hard to seek truth usefully with them... but also that only sometimes is strong evidence for X.
And the opposing view seems to be to dismiss arguments and evidence from the wrong sources, and to instead defer your beliefs elsewhere, which doesn't sound to me like symmetrical standards of evidence in any meaningful way, either.
David Deutsch's book "The Beginning of Infinity" explores Karl Popper's views on epistemology, the acquisition of knowledge, and the limits of truth. Here's a summary of Popper's ideas:
Epistemology: Popper believed that knowledge is not acquired through observation alone, but rather through the process of conjecture and refutation. He argued that scientific knowledge is always provisional and can never be completely certain.
The acquisition of knowledge: Popper argued that scientific theories can never be proven to be true, but they can be falsified. The process of scientific discovery involves formulating hypotheses, testing them, and then attempting to falsify them. This process leads to the development of new knowledge and the rejection of old, inadequate theories.
The limits of truth: Popper believed that there is no such thing as absolute truth or knowledge. Instead, our understanding of the world is always limited by our perspectives, our methods of inquiry, and our ability to formulate testable hypotheses. He argued that science is an ongoing process of approximation towards the truth, and that we can never be certain that we have arrived at the final truth.
Overall, Popper's views emphasize the importance of critical thinking, empirical testing, and the recognition of our own limitations in the pursuit of knowledge. His ideas have had a significant impact on the philosophy of science, and continue to be influential today.
Gotcha. I assume from your posting it that you endorse this summary. Sounds like there isn't anything here that I disagree with or don't already know, so no need to read the book at this time.
Well said. Just to add: reading Scott's 25k words about Iv. was a lot of FUN, because: Scott, obviously. Interesting, too - maybe enlightening even, dunno. See, I have a Masters in FLT (German as a foreign language) and minors in History/Theology. And I'm a patron of "acoup". - None of that matters. As in: really matters, pays my bills or whatever*. But it tickles my brain. What else is life for? - 98% of Scott's "more-than-you-want-to-know" readers are there for refined infotainment, I guess.
(*it gave me needed credentials. Early on I suspected FLT is no science. Nowadays I know on top of that: at college it is not even a craft.)
- Otoh: Scott should have ignored Kavanaugh, maybe. But then: epistemology, important kinda. -
Takeaway from covid: The biggest issue was never whether there were good/reasonable reasons to mask and to lockdown, but whether the downsides were given enough thought (and vice-versa). And the authorities on one aspect were usually utterly ignorant about the others. Thus: No one was "expert" enough. Well, overall you became one of the best. (I just don't know the others)
Just to clarify I'm not putting words in Deutsch's mouth. I'm saying that if one absorbed the essence of his book the original piece might have been conceived and framed differently. I will try to point out some of the more obvious sentences but the profound differences are in worldview or background assumptions more than specific points
“If we tell young people to (almost) always trust the American [X] Association on X, and journalists about the news, dismiss anything that the authorities call a conspiracy theory, and never get any practice thinking for themselves on such matters, we deserve what we get.”
Yes, and the sad thing is, I think it used to be, decades ago, more or less true that if you were not interested in / not able to look into complex topics yourself, you would more often than not get a good result by trusting the American X Association on X and journalists about the news. (Although occasionally things labeled conspiracy theories have turned out to be true. And it's always been beneficial to use your brain and think for yourself when something doesn't ring true.)
In general “trusting the experts” was a useful rule of thumb for the many, many things that we don’t have time to research for ourselves.
That is no longer true -- _really_ no longer true -- on many issues of importance (for instance when I hear such cheery reports on the results of EPA testing in East Palestine OH, I simply don’t believe it).
I don’t think it’s melodramatic to say this deserved loss of trust portends the unraveling of our society and culture.
If the only way you believe you can get accurate information is to look into every issue yourself -- well, ain’t nobody got time for that (not for everything that matters).
And so we end up being in a state of ignorance, knowing we’re in a state of ignorance (do I know anything about the air and water in East Palestine, Ohio? No) and unable to make good decisions. Perhaps the default decision becomes “freeze and do nothing,” because you don’t know whom to trust-- or perhaps you perceive “each ‘side’ has its own agenda and therefore none of them is to be trusted” -- and I believe that’s what happened to a lot of people with covid, especially in the US.
This yielded very bad results, if you measure US outcomes against the rest of the world’s.
I don’t know how or whether it’s possible to repair the damage. I doubt it is. We have a ruthless plundering profit-seeking class, and they will carry on until there’s nothing left to be gained. In the meantime, the truth of any matter be damned.
(1) First, do we all agree that the idea that Ivermectin might be effective against Covid is a CONSPIRACY THEORY? My impression from the Scott-Alexandros exchange was that it was a live scientific debate, but that Scott and most US medical experts think it's likely not to be significantly effective.
(2) Second, I'll offer myself as a test case reader.
(a) Scott waded into this area, conceded that there are a lot of studies indicating that Ivermectin might be effective, rejected many but not all of those studies, and then proposed a novel theory that Ivermectin shows effectiveness in the developing world because that population is more likely to also be infected with parasites.
(b) Then Alexandros made a bunch of very detailed criticisms of Scott's analysis, and Scott ultimately conceded that while he didn't agree with all of Alexandros's criticisms, he had unfairly dismissed some studies, and that the case for parasitic co-infection as an explanation for Ivermectin's results in developing world Covid studies is not accepted by at least one parasitic infection expert.
(c) As a result, I've updated to "Ivermectin is probably still not significantly effective against Covid, but the case for effectiveness is much stronger than I had been told." If we were still living in a pre-vaccine, pre-Paxlovid world, I'd probably stock up on Ivermectin just in case my family caught Covid, on the theory that it probably won't help, but that in appropriate doses, it's more likely to help than harm.
Now, if (c) is bad, then I guess Scott's interaction in this debate is bad. I'm not sure if it is or isn't.
It's probably my error, but by getting into more detail, and then conceding he was wrong on some important points, Scott raised my general assessment of his personal credibility, but also reduced my assessment that he's correct to discount Ivermectin in this specific case.
By contrast, when Zvi and Scott disagreed on the likely effectiveness of Vitamin D, my guesstimate was that Scott was closer to correct, in part specifically because he kept his argument at a simple, difficult to disprove level like "Vitamin D is often associated with positive health outcomes, but almost always disproved at the end."
Thanks to you and Scott for doing it. I'm most interested in the question of how we should form beliefs, so reading two people engaging in a subject that is over my head was helpful. :) (You can see that I'm engaging with the debate at a pretty removed level, and not trying to solve who's right where you and Scott disagree.)
I don't think Scott or Zvi think that the case for Ivermectin effectiveness against Covid is a conspiracy theory. Instead, I think they moved on to the topic of "should we engage with conspiracy theories," but they're not very precise about that.
Given that they are "moving on" after Scott's engagement with me, on the basis of CK's tweet, which was about me, I have a hard time seeing how the context doesn't apply to me. Besides, Scott has been painting me in that light for a while [1]. And even if somehow these spectacularly capable communicators don't "get" that that is the result of what they write, it ultimately matters not. The message that is being sent is that pushing back on the top rationalist sensemakers will get you branded as a conspiracy theorist.
I, like you, am mostly interested in questions of social epistemology. Much of my motivation was that disagreeing with Scott was a very strange situation for me, given the great respect I had for him, so the only reasonable thing to do was to dive in and figure it out. Seeing the result, I'm deeply concerned as to why people don't seem to care about sending a message that criticism isn't welcome, and the fragilization of the community that will bring. Sadly, one of the explanations, and the one I find most plausible, is that in the decade I stopped paying attention to the rationalist community, that kind of move has been played many times over, and it no longer raises an eyebrow.
A pattern that's taken me many many years to recognize as toxic: putting down the other side instead of offering any attempt at argumentation. It sounds brutally obvious when put like this, but unfortunately even Zvi is doing it quite frequently. I tend to swallow it from Zvi because:
- he's more than compensating by offering good analysis in other instances
- he tends to be right
- time is a limited resource, and some of his posts would have to be twice as long to avoid this.
It's not a great thing, though. But imagine whole books written with nothing but sneer. They're out there, and it's a god damn annoyingly hard thing to recognize as wrong. I think it has something to do with our social brain - if the author manages to suggest that the people claiming X are inferior, you have a hard time accepting X may be correct and the author just full of shit. It would mean that you risk putting yourself in the same class as the inferior people, and, perhaps unsurprisingly, your brain will strongly resist thinking that.
I am sad that I do this as much as I do, and would like to do it less. The inside perspective on what is going on when this happens is that I am assuming my readers already know the arguments necessary, or I've already given them elsewhere and am taking them as given, or at least I think this is the case - I like to think I am assuming facts previously introduced into evidence. And you have to do at least some of that structure building, if you want to accomplish anything.
If I'm doing this type of thing in unreasonable fashion, by all means call me out on it when you see it.
And yeah, those books exist, oh boy do they exist.
Like Dave, I also noticed this in the early Covid posts. In some cases it wasn't even very clear what I should believe, only that the tone indicated it was an obvious answer. Those were weird times indeed - a few days'/hours' difference, or reading a certain blog, could give you a whole lot more knowledge on a topic than weeks of thinking about it - so yeah, it was quite possible to have huge knowledge asymmetries with readers, and the inside perspective you describe fails.
The main problem with that is that it's... very effective. My brain was perfectly capable of noticing the lack of arguments, and it did, and still went into full "I gotta find out what Zvi believes so I know what to believe" mode. It quite probably even got away with it a few times.
I guess it wouldn't have lasted as long if it weren't a great rhetorical device. Touches the social brain, makes the reader feel special, has built-in protection from the opposition by painting them as low status and, best of all, gives you an absolutely tiny footprint since you don't really provide any arguments to refute. Plus it can be learned as a skill and applied quickly on a broad range of topics. (Does it have a name?)
It's honestly too good to give up :)) But I don't know if it can be used responsibly. Keeping a milder zinger and adding the arguments as well would be a good start, probably.
I kind of feel like the COVID posts have run their course and have a lot more dunking on the usual suspects than they used to. Other posts have less sneering on average it seems like.
Every time somebody uses "conspiracy theory" as a derogatory term, they are wrong. Conspiracies do exist, and lately the "conspiracy theorists" have turned out more right than wrong. So all that people who are hasty to label them achieve is showing their own bias against certain kinds of truths.
The important question should always be "is this true?", not "is this a conspiracy theory?".
How do you demarcate a true conspiracy theory from a false conspiracy theory? I'll tell you how *not* to do it: trusting experts over your own judgement.
This is a completely cowardly heuristic. You can't say that argument from authority is a fallacy... except when the conclusion is true. You are supposed to first determine the validity of the argument in order to consider the truth of the conclusion, not use the likelihood of the conclusion to determine the validity of the argument.
I don't understand how "follow the rules of rationality... except when the conclusion is yucky" is supposed to be an intelligent guideline.
I explore how "skeptics" do not properly evaluate conspiracy theories in this substack article:
> I don't understand how "follow the rules of rationality... except when the conclusion is yucky" is supposed to be an intelligent guideline.
I suppose the charitable reading is as a reminder that your intuition, while broken in many ways, _is_ (weak) Bayesian evidence. If you can't figure out what's wrong with the argument but your intuition goes off AND you also can't figure out why (i.e. it's not reducible to some well-known bias), then don't forget to factor in that evidence, however imprecise it is.
But evidence isn't proof. Something shouldn't be believed just because there's evidence for it. And I don't subscribe to Bayesian epistemology. Believing something for bad reasons is wrong, even if it's just a little bit.
"Something shouldn't be believed just because there's evidence for it" is a very Bayesian thing to say :) Because there's normally evidence on both sides.
"Proof" is unattainable beyond math, even "proof beyond reasonable doubt" is unattainable in most spheres.
If you believe something isn't true because it seems like a conspiracy theory, and your intuition tells you that conspiracy theories aren't true, that's a *bad* reason.
Alternatively, it's implicit outside view of "theories of such shape are more often wrong than not because they're easy to generate", which is a *good* reason to lower confidence.
An odd thing about communicating evidence, is that if I see a werewolf, who howls at the moon, flips me the bird, and scampers off into the woods, and I tell you about it, then the bayesian evidence that I have for P(werewolves exist) is seeing [what appears to be] a werewolf, hearing it howl at the moon, and watching it flip me the bird & scamper off into the woods, but the bayesian evidence you have for P(werewolves exist) is just "a guy told me a story about seeing a werewolf".
Afterwards, it would be rational for you to mainly update your posterior of P("dang, people say crazy things sometimes"), while for me it's now rational to buy silver & bullet molds.
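The asymmetry between first-hand observation and secondhand testimony can be put in Bayesian odds form. A minimal sketch, where the prior and both likelihood ratios are invented purely for illustration; only their relative ordering carries the point:

```python
# Toy Bayesian update illustrating the werewolf example above.
# Illustrative numbers only: direct observation carries a much larger
# likelihood ratio than "a guy told me a story".

def update(prior: float, likelihood_ratio: float) -> float:
    """Apply Bayes' rule in odds form and convert back to a probability."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

prior = 1e-6  # shared prior that werewolves exist

# The observer: seeing (what appears to be) a werewolf is far more likely
# in a world with werewolves than in one without -- a large likelihood ratio.
p_observer = update(prior, likelihood_ratio=10_000)

# The listener: a werewolf story is only slightly more likely in a world
# with werewolves, since people say crazy things either way.
p_listener = update(prior, likelihood_ratio=2)

print(f"observer: {p_observer:.6f}")  # moves substantially off the prior
print(f"listener: {p_listener:.6f}")  # barely moves at all
```

Same evidence chain, very different rational posteriors, which is the point of the comment above: the listener's update is dominated by the base rate of people telling tall tales.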
I was getting ready to cheer on this post until I saw that you reported on private conversations we had, and in which I engaged in the spirit of friendly conversation, in such a way as to smear me in a way I cannot defend other than by publishing said conversations. I am updating in the direction that when engaging with people of this "community" in the spirit of cooperation, I will continue to be defected upon until I learn my lesson.
A sad state of affairs that engaging in critique of an objectively error-riddled argument, and raising significant issues, gets me labeled a conspiracy theorist. And even the people who push back on the advice to free ride on humanity feel the obligation to reassert that I am a conspiracy theorist based on private conversation.
I don't know what happened to the rationality community I used to know, but sadly my conclusion is that engaging in private was an error.
I am genuinely sorry that you feel I betrayed your trust here by summarizing my view of the conversations we had. My statements here were not violating the privacy norms as I understand them, as I believed you would have happily (and would happily now) made all the same points in public.
I did not mean to imply you are a conspiracy theorist - I have edited to explicitly say I am NOT claiming that, to avoid the implication. (I changed the post title at the very end to make it more long-term and not refer to the original post, which may have created an implication I did not intend, so better to be clear and avoid that.)
I appreciate the response and believe you are honestly reporting on your perspective.
The simple problem I'm faced with is that you create a "he said she said" situation for me. I like to address things with specifics. I don't know which claims of mine you objected to, and frankly, if you didn't want to have the conversation, I respect that.
You write that I "talked to you" as if you were a passive object of my talking, but I don't recall answering questions I was not asked. Sadly I am locked out of my twitter account and cannot review the logs of our conversation.
And I assume with this you are faced with the question of whether to detail the issues, which you doubtlessly do not want to spend your time on. I get that. What I do not get is... what happened to simply disagreeing on stuff? What happened to making arguments and letting everyone make up their own mind? Why does everything have to be turned into camps and politics and discrediting people?
I've been observing this deranged conversation where the upshot of the fact that I chose to spend my own time documenting dozens of errors in what is likely Scott's most influential piece [1] is to raise the question of whether anyone should even engage with me, with nobody actually claiming I was substantially wrong in my claims. I didn't expect to be thanked. But I did expect to be treated fairly, or at least to be granted the dignity of being ignored. What I get instead is this trend where I get turned into the symbol of a dumb person **for being right**. It is just too much.
And maybe you're unfairly getting the brunt of my anger for being the straw that broke the camel's back, but please appreciate the fact that when honest and credible criticism is being met with such backlash, the community will get less and less of it, fragilizing the validity of its claims.
And because I recognize and respect that you don't have the time to get to the bottom of our disagreement, and since I don't want vague insinuations about what I said or didn't say to be floating around on the internet, sadly the best strategy would have been never to talk to you in private in the first place.
C'est la vie. We live and learn. I really do appreciate your response and don't expect another.
I was impressed with ChatGPT's summary but would still recommend the book. Imo it's one of the most elegant essays on human knowledge and enquiry ever, on the level of Aristotle and Kant. It can help anyone think more clearly, and there's a great deal to be gained by absorbing a great thinker in action...
Upon some thought, I am thinking of publishing the logs of our conversation relating to Pfizer, vaccines, etc. I was able to get access to it, and I think it was a great conversation worth putting out in public. Do you have any objections? I would like to document what I said, why I said it, and let readers make up their own minds.
'So after spending several hours on this, I concluded continued engagement was not a good use of my time, and I stopped doing it. I think that was a good decision process.'
I've watched Alexandros' review of your conversation. It seemed in good spirit, although it seemed you pulled back a bit once he informed you (by means of authoritative sources) of the rapidly waning vaccine protection; something that was apparently not common knowledge then as it is now, but which appeared to be very much at odds with your worldview at the time.
It would have been nice of you to credit Alexandros with bringing you that insight. Even nicer if you specified what he said about Pfizer or vaccines that made you feel like you should disengage.
The way it stands, it would appear that back then Alexandros made you uncomfortable by challenging your worldview with correct information about rapidly waning vaccine effectiveness; and all you remembered from that exchange is the bad taste this challenge to your worldview left in your mouth, and you did not actually check whether the specifics of your exchange had aged well, one way or the other.
I don't really disagree but let me say this: I spent like 30 years on the theory that if folks made good, rational decisions my ideas would work and if they made bad ones it was on them. It has never, ever, produced good results for anyone. Some acknowledgement of the machinations of Moloch needs to be made for any large social problem and at this point I'd consider "people believing false things because they don't like the people who believe the true things" to be a large social problem.
What were some of the more interesting or promising ideas?
Unfortunately, nothing fun or unique to me, mostly boring libertarian worldview stuff.
Okay. I just wish comments had more substance. Like walk through an actual example to demonstrate the point. Maybe you're dropping wisdom? It's hard to tell...
I'm trying to break myself of the habit of writing a book as a comment. But, for example, it's very easy to make a claim like "if a subscription model social media network provided more value than an advertising model, then folks would pay for subscriptions. They don't, so clearly either the downsides of an advertising model are less destructive than the cost of a subscription or folks are making poor content choices. Either way, it's the individual's responsibility to make the best choices for themselves, so larger institutions shouldn't intervene in that choice."
But of course the preference for advertisement isn't built individually. The whole point of social media is to be networked with people you want to be networked with. And so you may decide to join an advertisement-based network for those reasons. Or you may join because you're overvaluing short-term loss aversion when you'd actually be happier with a subscription model in the long-run. At any rate, it doesn't matter from a consequentialist point of view because the fact is that people are choosing an advertisement-based model to their own detriment, and that's hurting even folks who are acting rationally, because it's building a culture that is obsessed with attention-capture at the expense of actual value.
To bring it back to this post, I absolutely agree that there are massive downsides to telling people to accept official narratives in lieu of their own judgment. Except that most people are awful at reasoning. Zvi and many other rationalist-bent folks tend to respond to that by saying "well then people should be better at reasoning and if not it's on them." I understand that point of view incredibly well but the fact is that "make people better at reasoning about every single issue that might pop up on a Facebook news feed" is an incredibly difficult, borderline impossible goal. Meanwhile, conspiracy theories abound, because people's poor skill at reasoning and emotional reaction to content cause them to want to embrace untrue narratives. And saying "Well on their own heads be it" ignores the fact that this forces us to live in a society where stakeholders believe nonsense and make choices that affect all of us based on that nonsense. It's a consequentialist disaster.
It's possible (I think probable) that it would be worse or exacerbate the problem to push a "trust experts" narrative to counter this, but it does need to be countered. All I can say for sure is "do your own research" is not actually successful for most people, who suck at contextualizing that research and putting it into perspective.
> All I can say for sure is "do your own research" is not actually successful for most people, who suck at contextualizing that research and putting it into perspective.
To further complicate that, pretty much nobody is actually doing first-order research. Professionals are doing the research, the "skilled rationalists" like Scott are doing a meta-analysis, and the less skilled are running some kind of tribal popularity contest to determine which experts to believe. So in practice everyone _already_ is listening to _some_ form of expert, and the problem is actually choosing among competing opinions. That makes both the advice to "listen to experts" and to "do your own research" somewhat intractable. We aren't actually crowdfunding double-blind studies or whatever.
Great post.
One of the things that I've noticed when digging into a few conspiracy theories is that they "rhyme", semantically. They all seem to use the same tricks:
- "you can trust me, why would I lie to you?"
- "this expert was wrong in this one particular place, so nothing she says can be trusted"
- "I have a single interesting argument backed by facts, so you should believe everything else I say"
- "if you accept the conventional wisdom, you're a sheep. You don't want to be a sheep, do you?"
- "Here's a fact. Allow me to extrapolate from that fact to the nth degree"
If I were emperor, I would explore the idea of having students earning advanced degrees demonstrate their critical thinking skills by debunking a conspiracy theory. Not because the theory needs debunking, necessarily, but because the process of dissecting those arguments will set that student up to recognize the "rhyme and meter" of other conspiracy theories.
It's good to draw a distinction between the *case for* a conspiracy theory, the way it is presented, and the theory itself. Certainly when you see someone engaging in the CS-grift-hype-snow cycle, or whatever you would best call it - the thing you call rhyme and meter - you should downgrade the source accordingly.
Very true. I've seen so many of the grift-based appeals that I'd forgotten "not-grift-based" appeals still exist
To me, this sounds a lot like what most people say regarding more standard theories, to the point that a regular layperson may have trouble distinguishing between the Standard theory and the Conspiracy theory. Even big name experts often say similar things, like change your second line to "this other expert..." and the fourth to "if you accept [other theory] you're a rube..."
"Expert" is not a real category in many disputes. It's also extremely fuzzy when multiple experts disagree with each other, as is very often the case. How do you say that ivermectin doesn't work when hundreds (thousands?) of doctors and scientists are trying it with active patients and several countries have added it to their official treatment advice? Personally, I say that you *don't take a stance* on that question because it's still open. Even if you're a doctor working actively on testing it, you say something like "early results indicate..." Right now, my feelings on ivermectin are "pretty unlikely to work, maybe for certain people in certain situations that we don't fully understand." There's not much harm to be found there, either way. If I were prescribed it by my doctor, I would be willing to try it (as with most things he might prescribe to treat what I went to him to fix), but wouldn't pressure him to prescribe it.
Trying to do otherwise becomes an exercise in identifying who the "real" experts are, which unsurprisingly aligns almost perfectly with which experts say the things we already think are true.
Bam.
It feels like this post might not have sufficiently digested David Deutsch's The Beginning of Infinity, or at least its Popperian sections... It's as if there is an assumption that we can do better than dialogue and critique. Deutsch would disagree, and it's hard to argue with him once the case is laid out.
I have a friend who holds several conspiracy theories and he always asks for certainty from the positions that oppose him, and believes he has exposed them when they can't provide it. So he believes the moon landings were faked and just because the scenes could have been filmed in a studio seems to him proof that they were. Isn't the beginning of rationality an agreement to accept the same standards of evidence on both sides of an argument?
It would be good to lay out which angle or paragraph of the post offends Deutsch, the nature of the offense, and the proposed remedy. What is beyond dialogue and critique that is embraced, incorrectly, here?
> Isn't the beginning of rationality an agreement to accept the same standards of evidence on both sides of an argument?
Who are you questioning here? This post argues for symmetric standards of evidence:
> I was given the same test, here. In addition to Scott Alexander, Alexandros talked extensively to me. His initial claims were thoughtful and interesting, and I engaged. It became quickly clear that he was applying asymmetric standards to evidence.
I did not read Deutsch. I could be persuaded to with a high enough combined bid from people I trust, seems like a reasonable thing to do.
Perhaps this is, as came up in another comment, the difference between a theory and the typical method of arguing for that theory? If someone says 'if you don't prove X then that proves ~X' or things like that, it is very hard to seek truth usefully with them... but also that only sometimes is strong evidence for X.
And the opposing view seems to be to dismiss arguments and evidence from the wrong sources, and to instead defer your beliefs elsewhere, which doesn't sound to me like symmetrical standards of evidence in any meaningful way, either.
This is what the chatbot says about the book...
David Deutsch's book "The Beginning of Infinity" explores Karl Popper's views on epistemology, the acquisition of knowledge, and the limits of truth. Here's a summary of Popper's ideas:
Epistemology: Popper believed that knowledge is not acquired through observation alone, but rather through the process of conjecture and refutation. He argued that scientific knowledge is always provisional and can never be completely certain.
The acquisition of knowledge: Popper argued that scientific theories can never be proven to be true, but they can be falsified. The process of scientific discovery involves formulating hypotheses, testing them, and then attempting to falsify them. This process leads to the development of new knowledge and the rejection of old, inadequate theories.
The limits of truth: Popper believed that there is no such thing as absolute truth or knowledge. Instead, our understanding of the world is always limited by our perspectives, our methods of inquiry, and our ability to formulate testable hypotheses. He argued that science is an ongoing process of approximation towards the truth, and that we can never be certain that we have arrived at the final truth.
Overall, Popper's views emphasize the importance of critical thinking, empirical testing, and the recognition of our own limitations in the pursuit of knowledge. His ideas have had a significant impact on the philosophy of science, and continue to be influential today.
Gotcha. I assume from your posting it that you endorse this summary. Sounds like there isn't anything here that I disagree with or don't already know, so no need to read the book at this time.
Well said. Just to add: reading Scott's 25k words about Iv. was a lot of FUN, because: Scott, obviously. Interesting, too - maybe enlightening even, dunno. See, I have a Masters in FLT (German as a foreign language) and minors in History/Theology. And I'm a patron of "acoup". - None of that matters. As in: really matters, pays my bills or whatever*. But it tickles my brain. What else is life for? - 98% of Scott's "more-than-you-want-to-know"-readers are there for refined infotainment, I guess.
(*it gave me needed credentials. Early on I suspected FLT is no science. Nowadays I know on top of that: at college it is not even a craft.)
- Otoh: Scott should have ignored Kavanaugh, maybe. But then: epistemology, important kinda. -
Takeaway from covid: The biggest issue was never whether there were good/reasonable reasons to mask and to lock down, but whether the downsides were given enough thought (and vice versa). And the authorities for one aspect were usu. utterly ignorant about the others. Thus: No one was "expert" enough. Well, overall you became one of the best. (I just don't know the others)
Just to clarify I'm not putting words in Deutsch's mouth. I'm saying that if one absorbed the essence of his book the original piece might have been conceived and framed differently. I will try to point out some of the more obvious sentences but the profound differences are in worldview or background assumptions more than specific points
“If we tell young people to (almost) always trust the American [X] Association on X, and journalists about the news, dismiss anything that the authorities call a conspiracy theory, and never get any practice thinking for themselves on such matters, we deserve what we get.”
Yes, and the sad thing is, I think it used to be (decades ago) more or less true that if you were not interested in / not able to look into complex topics yourself, you would more often than not get a good result by trusting the American X Association on X and journalists about the news. (Although occasionally things labeled conspiracy theories have turned out to be true. And it's always been beneficial to use your brain and think for yourself when something doesn't ring true.)
In general “trusting the experts” was a useful rule of thumb for the many, many things that we don’t have time to research for ourselves.
That is no longer true -- _really_ no longer true -- on many issues of importance (for instance when I hear such cheery reports on the results of EPA testing in East Palestine OH, I simply don’t believe it).
I don’t think it’s melodramatic to say this deserved loss of trust portends the unraveling of our society and culture.
If the only way you believe you can get accurate information is to look into every issue yourself -- well, ain’t nobody got time for that (not for everything that matters).
And so we end up being in a state of ignorance, knowing we’re in a state of ignorance (do I know anything about the air and water in East Palestine, Ohio? No) and unable to make good decisions. Perhaps the default decision becomes “freeze and do nothing,” because you don’t know whom to trust-- or perhaps you perceive “each ‘side’ has its own agenda and therefore none of them is to be trusted” -- and I believe that’s what happened to a lot of people with covid, especially in the US.
This yielded very bad results, if you measure US outcomes against the rest of the world’s.
I don’t know how or whether it’s possible to repair the damage. I doubt it is. We have a ruthless plundering profit-seeking class, and they will carry on until there’s nothing left to be gained. In the meantime, the truth of any matter be damned.
A couple comments:
(1) First, do we all agree that the idea that Ivermectin might be effective against Covid is a CONSPIRACY THEORY? My impression from the Scott-Alexandros exchange was that it was a live scientific debate, but that Scott and most US medical experts think it's likely not to be significantly effective.
(2) Second, I'll offer myself as a test case reader.
(a) Scott waded into this area, conceded that there are a lot of studies indicating that Ivermectin might be effective, rejected many but not all of those studies, and then proposed a novel theory that Ivermectin shows effectiveness in the developing world because that population is more likely to also be infected with parasites.
(b) Then Alexandros made a bunch of very detailed criticisms of Scott's analysis, and Scott ultimately conceded that while he didn't agree with all of Alexandros's criticisms, he had unfairly dismissed some studies, and that the case for parasitic co-infection as an explanation for Ivermectin's results in developing world Covid studies is not accepted by at least one parasitic infection expert.
(c) As a result, I've updated to "Ivermectin is probably still not significantly effective against Covid, but the case for effectiveness is much stronger than I had been told." If we were still living in a pre-vaccine, pre-Paxlovid world, I'd probably stock up on Ivermectin just in case my family caught Covid, on the theory that it probably won't help, but that in appropriate doses, it's more likely to help than harm.
Now, if (c) is bad, then I guess Scott's interaction in this debate is bad. I'm not sure if it is or isn't.
It's probably my error, but by getting into more detail, and then conceding he was wrong on some important points, Scott raised my general assessment of his personal credibility, but also reduced my assessment that he's correct to discount Ivermectin in this specific case.
By contrast, when Zvi and Scott disagreed on the likely effectiveness of Vitamin D, my guesstimate was that Scott was closer to correct, in part specifically because he kept his argument at a simple, difficult to disprove level like "Vitamin D is often associated with positive health outcomes, but almost always disproved at the end."
I applaud you for seemingly being one of the very few people who has been able to keep track of this conversation.
Thanks to you and Scott for doing it. I'm most interested in the question of how we should form beliefs, so reading two people engaging in a subject that is over my head was helpful. :) (You can see that I'm engaging with the debate at a pretty removed level, and not trying to solve who's right where you and Scott disagree.)
I don't think Scott or Zvi think that the case for Ivermectin effectiveness against Covid is a conspiracy theory. Instead, I think they moved on to the topic of "should we engage with conspiracy theories," but they're not very precise about that.
Given that they are "moving on" after Scott's engagement with me, on the basis of CK's tweet, which was about me, I have a hard time seeing how the context doesn't apply to me. Besides, Scott has been painting me in that light for a while [1]. And even if somehow these spectacularly capable communicators don't "get" that that is the result of what they write, it ultimately matters not. The message that is being sent is that pushing back on the top rationalist sensemakers will get you branded as a conspiracy theorist.
I, like you, am mostly interested in questions of social epistemology. Much of my motivation was that disagreeing with Scott was a very strange situation for me, given the great respect I had for him, so the only reasonable thing to do was to dive in and figure it out. Seeing the result, I'm deeply concerned as to why people don't seem to care about sending a message that criticism isn't welcome, and the fragilization of the community that will bring. Sadly, one of the explanations, and the one I find most plausible, is that in the decade I stopped paying attention to the rationalist community, that kind of move has been played many times over, and it no longer raises an eyebrow.
[1]: https://astralcodexten.substack.com/p/bounded-distrust
A pattern that's taken me many, many years to recognize as toxic: putting down the other side instead of offering any attempt at argumentation. It sounds brutally obvious when put like this, but unfortunately even Zvi does it quite frequently. I tend to swallow it from Zvi because:
- he's more than compensating by offering good analysis in other instances
- he tends to be right
- time is a limited resource, and some of his posts would have to be twice as long to avoid this.
It's not a great thing, though. But imagine whole books written with nothing but sneer. They're out there, and it's a god damn annoyingly hard thing to recognize as wrong. I think it has something to do with our social brain - if the author manages to suggest that the people claiming X are inferior, you have a hard time accepting X may be correct and the author just full of shit. It would mean that you risk putting yourself in the same class as the inferior people, and, perhaps unsurprisingly, your brain will strongly resist thinking that.
I am sad that I do this as much as I do, and would like to do it less. The inside perspective on what is going on when this happens is that I am assuming my readers already know the arguments necessary, or I've already given them elsewhere and am taking them as given, or at least I think this is the case - I like to think I am assuming facts previously introduced into evidence. And you have to do at least some of that structure building, if you want to accomplish anything.
If I'm doing this type of thing in unreasonable fashion, by all means call me out on it when you see it.
And yeah, those books exist, oh boy do they exist.
Like Dave, I also noticed this in the early Covid posts. In some cases it wasn't even very clear what I should believe, only that the tone indicated it was an obvious answer. Those were weird times indeed - a few days/hours difference or reading a certain blog could give you a whole lot more knowledge on a topic than weeks of thinking about it - so yeah, it was quite possible to have huge knowledge asymmetries with readers, and the inside perspective you describe fails.
The main problem with that is that it's... very effective. My brain was perfectly capable of noticing the lack of arguments, and it did, and still went into full "I gotta find out what Zvi believes so I know what to believe" mode. It quite probably even got away with it a few times.
I guess it wouldn't have lasted as long if it weren't a great rhetorical device. Touches the social brain, makes the reader feel special, has built-in protection from the opposition by painting them as low status and, best of all, gives you an absolutely tiny footprint since you don't really provide any arguments to refute. Plus it can be learned as a skill and applied quickly to a broad range of topics. (Does it have a name?)
It's honestly too good to give up :)) But I don't know if it can be used responsibly. Keeping a milder zinger and adding the arguments as well would be a good start, probably.
I kind of feel like the COVID posts have run their course and have a lot more dunking on the usual suspects than they used to. Other posts have less sneering on average it seems like.
Every time somebody uses the term "conspiracy theory" as a pejorative, they are wrong. Conspiracies do exist, and lately the "conspiracy theorists" have turned out more right than wrong. So all that people who are hasty to label them achieve is showing their own bias against certain kinds of truths.
The important question should always be "is this true?", not "is this a conspiracy theory?".
How do you demarcate a true conspiracy theory from a false conspiracy theory? I'll tell you how *not* to do it: trusting experts over your own judgement.
This is a completely cowardly heuristic. You can't say that argument from authority is a fallacy... except when the conclusion is true. You are supposed to first determine the validity of the argument in order to consider the truth of the conclusion, not use the likelihood of the conclusion to determine the validity of the argument.
I don't understand how "follow the rules of rationality... except when the conclusion is yucky" is supposed to be an intelligent guideline.
I explore how "skeptics" do not properly evaluate conspiracy theories in this substack article:
https://felipec.substack.com/p/dismantling-pseudo-skepticism
> I don't understand how "follow the rules of rationality... except when the conclusion is yucky" is supposed to be an intelligent guideline.
I suppose the charitable reading is as a reminder that your intuition, while broken in many ways, _is_ (weak) Bayesian evidence. If you can't figure out what's wrong with the argument but your intuition goes off AND you also can't figure out why (i.e. it's not reducible to some well-known bias), then don't forget to factor in that evidence, however imprecise it is.
But evidence isn't proof. Something shouldn't be believed just because there's evidence for it. And I don't subscribe to Bayesian epistemology. Believing something for bad reasons is wrong, even if it's just a little bit.
"Something shouldn't be believed just because there's evidence for it" is a very Bayesian thing to say :) Because there's normally evidence on both sides.
"Proof" is unattainable beyond math, even "proof beyond reasonable doubt" is unattainable in most spheres.
As for "bad reasons" - define those.
If you believe something isn't true because it seems like a conspiracy theory, and your intuition tells you that conspiracy theories aren't true, that's a *bad* reason.
Alternatively, it's implicit outside view of "theories of such shape are more often wrong than not because they're easy to generate", which is a *good* reason to lower confidence.
An odd thing about communicating evidence, is that if I see a werewolf, who howls at the moon, flips me the bird, and scampers off into the woods, and I tell you about it, then the bayesian evidence that I have for P(werewolves exist) is seeing [what appears to be] a werewolf, hearing it howl at the moon, and watching it flip me the bird & scamper off into the woods, but the bayesian evidence you have for P(werewolves exist) is just "a guy told me a story about seeing a werewolf".
Afterwards, it would be rational for you to mainly update your posterior of P("dang, people say crazy things sometimes"), while for me it's now rational to buy silver & bullet molds.
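The asymmetry above can be sketched as a quick Bayes update in odds form. This is a minimal illustration, not anything from the thread: the prior and the likelihood ratios are made-up numbers chosen only to show that the same event supports very different updates for the witness and the listener.

```python
def posterior(prior: float, likelihood_ratio: float) -> float:
    """Bayes update in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Both parties start out thinking werewolves are very unlikely.
PRIOR = 1e-6

# The witness's evidence (a direct, detailed sighting) is far stronger than
# the listener's evidence ("a guy told me a story"), because people
# misperceive things far less often than they tell tall tales.
p_witness = posterior(PRIOR, likelihood_ratio=10_000)
p_listener = posterior(PRIOR, likelihood_ratio=2)

print(p_witness, p_listener)  # the witness updates far more than the listener
```

Same prior, same event, but the evidence each person actually possesses differs, so their rational posteriors diverge by orders of magnitude.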
I was getting ready to cheer on this post until I saw that you reported on private conversations we had, and in which I engaged in the spirit of friendly conversation, in such a way as to smear me in a way I cannot defend other than by publishing said conversations. I am updating in the direction that when engaging with people of this "community" in the spirit of cooperation, I will continue to be defected upon until I learn my lesson.
A sad state of affairs that engaging in critique of an objectively error-riddled argument, and raising significant issues, gets me labeled a conspiracy theorist. And even the people who push back on the advice to free ride on humanity feel the obligation to reassert that I am a conspiracy theorist based on private conversation.
I don't know what happened to the rationality community I used to know, but sadly my conclusion is that engaging in private was an error.
I am genuinely sorry that you feel I betrayed your trust here by summarizing my view of the conversations we had. My statements here were not violating the privacy norms as I understand them, as I believed you would have happily (and would happily now) made all the same points in public.
I did not mean to imply you are a conspiracy theorist - I have edited to explicitly say I am NOT claiming that, to avoid the implication. (I changed the post title at the very end to make it more long-term and not refer to the original post, which may have created an implication I did not intend, so better to be clear and avoid that.)
I appreciate the response and believe you are honestly reporting on your perspective.
The simple problem I'm faced with is that you create a "he said she said" situation for me. I like to address things with specifics. I don't know which claims of mine you objected to, and frankly, if you didn't want to have the conversation, I respect that.
You write that I "talked to you" as if you were a passive object of my talking, but I don't recall answering questions I was not asked. Sadly I am locked out of my twitter account and cannot review the logs of our conversation.
And I assume with this you are faced with the question of whether to detail the issues, which you doubtlessly do not want to spend your time on. I get that. What I do not get is... what happened to simply disagreeing on stuff? What happened to making arguments and letting everyone make up their own mind? Why does everything have to be turned into camps and politics and discrediting people?
I've been observing this deranged conversation where the upshot of the fact that I chose to spend my own time documenting dozens of errors in what is likely Scott's most influential piece [1] raises the question of whether anyone should even engage with me, with nobody actually claiming I was substantially wrong in my claims. I didn't expect to be thanked. But I did expect to be treated fairly, or at least to be granted the dignity of being ignored. What I get instead is this trend where I get turned into the symbol of a dumb person **for being right**. It is just too much.
And maybe you're unfairly getting the brunt of my anger for being the straw that broke the camel's back, but please appreciate the fact that when honest and credible criticism is being met with such backlash, the community will get less and less of it, fragilizing the validity of its claims.
And because I recognize and respect that you don't have the time to get to the bottom of our disagreement, and since I don't want vague insinuations about what I said or didn't say floating around on the internet, sadly the best strategy would have been never to talk to you in private in the first place.
C'est la vie. We live and learn. I really do appreciate your response and don't expect another.
[1]: https://docs.google.com/spreadsheets/d/1CIiDk07Mw6v1LsrIX6H5ragw2c8lgfOchGKQq9_5i4k/edit?usp=drivesdk
I was impressed with ChatGPT's summary but would still recommend the book. Imo it's one of the most elegant essays on human knowledge and enquiry ever, on the level of Aristotle and Kant. It can help anyone think more clearly, and there's a great deal to be gained by absorbing a great thinker in action...
Kavanagh was gatekeeping in that smug way that drives even someone like me nuts.
Hi Zvi,
Upon some thought, I am thinking of publishing the logs of our conversation relating to pfizer, vaccines, etc. I was able to get access to it, and I think it was a great conversation worth putting out in public. Do you have any objections? I would like to document what I said, why I said it, and let readers make up their own mind.
I think that is entirely fair, if you wish to do that. Let freedom ring.
'So after spending several hours on this, I concluded continued engagement was not a good use of my time, and I stopped doing it. I think that was a good decision process.'
I've watched Alexandros's review of your conversation. It seemed in good spirit, although it seemed you pulled back a bit once he informed you (by means of authoritative sources) of the rapidly waning vaccine protection - something that was apparently not common knowledge then as it is now, but which appeared to be very much at odds with your worldview at the time.
It would have been nice of you to credit Alexandros with bringing you that insight. Even nicer if you specified what he said about Pfizer or vaccines that made you feel like you should disengage.
The way it stands, it would appear that back then Alexandros made you uncomfortable by challenging your worldview with correct information about rapidly waning vaccine effectiveness, and all you remembered from that exchange is the bad taste this challenge left in your mouth, and you did not actually check whether the specifics of your exchange had aged well, one way or the other.