Link to the EA version of this post.
EA doesn’t have strong norms against misquoting or some other types of errors related to having high intellectual standards (which I claim are important to truth seeking). As I explained, misquoting is especially bad: “Misquoting puts words in someone else’s mouth without their consent. It takes away their choice of what words to say or not say, just like deadnaming takes away their choice of what name to use.”
Despite linking to lizka clarifying the lack of anti-misquoting norms, I got this feedback on my anti-misquoting article:
One of your post spent 22 minutes to say that people shouldn't misquote. It's a rather obvious conclusion that can be exposed in 3 minutes top. I think some people read that as a rant.
So let me try to explain that EA really doesn’t have strong anti-misquoting norms or strong norms for high intellectual standards and scholarship quality. What would such norms look like?
Suppose I posted a single misquote in Rationality: From AI to Zombies (RAZ). Suppose it was one word added or omitted and it didn’t change the meaning much. Would people care? I doubt it. How many people would want to check other quotes in the book for errors? Few, maybe zero. How many would want to post mortem the cause of the error? Few, maybe zero. So there is no strong norm against misquotes. Am I wrong? Does anyone really think that finding a single misquote in a book this community likes would result in people making large updates to their views (even if the misquote is merely inaccurate, but doesn’t involve a large change in meaning)?
Similarly, I’m confident that there’s no strong norm against incorrect citations. E.g. suppose in RAZ I found one cite to a study with terrible methodology or glaring factual errors. Or suppose I found one cite to a study that says something different than what it’s cited for (e.g. it’s cited as saying 60% X but the study itself actually says 45% X). I don’t think anything significant would change based on pointing out that one cite error. RAZ’s reputation would not go down substantially. There’d be no major investigation into what process created this error and what other errors the same process would create. It probably wouldn’t even spark debates. It certainly wouldn’t result in a community letter to EY, signed by thousands of people with over a million total karma, asking for an explanation. The community simply tolerates such things. This is an example of intellectual standards I consider too low and believe are lowering EA’s effectiveness by a large amount.
Even most of RAZ’s biggest fans don’t really expect the book to be correct. They only expect it to be mostly correct. If I find an error, and they agree it’s an error, they’ll still think it’s a great book. Their fandom is immune to correction via pointing out one error.
(Just deciding “RAZ sucks” due to one error would be wrong too. The right reaction is more complicated and nuanced. For some information on the topic, see my Resolving Conflicting Ideas, which links to other articles including We Can Always Act on Non-Criticized Ideas.)
What about two errors? I don’t think that would work either. What about three errors? Four? Five? Nah. What exactly would work?
What about 500 errors? If they’re all basically indisputable, then I’ll be called picky and pedantic, people will doubt that other books would stand up to a similar level of scrutiny either, and people will say that the major conclusions are still valid.
If the 500 errors include more substantive claims that challenge the book’s themes and concepts, then they’ll be more debatable than factual errors, misquotes, wrong cites, simple, localized logic errors, grammar errors, etc. So that won’t work either. People will disagree with my criticism. And then they won’t debate their disagreement persistently and productively until we reach a conclusion. Some people won’t say anything at all. Others will comment 1-5 times expressing their disagreement. Maybe a handful of people will discuss more, and maybe even change their minds, but the community in general won’t change their minds just because a few people did.
There are errors that people will agree are in fact errors, but will dismiss as unimportant. And there are errors which people will deny are errors. So what would actually change many people’s minds?
Becoming a high status, influential thought leader might work. But social climbing is a very different process than truth seeking.
If people liked me (or whoever the critic was) and liked some alternative I was offering, they’d be more willing to change their minds. Anyone who wanted to say “Yeah, Critical Fallibilism is great. RAZ is outdated and flawed.” would be receptive to the errors I pointed out. People with the right biases or agendas would like the criticisms because the criticisms help them with their goals. Other people would interpret the criticism as fighting against their goals, not helping – e.g. AI alignment researchers basing a lot of their work on premises from RAZ would tend to be hostile to the criticism instead of grateful for the opportunity to stop using incorrect premises and thereby wasting their careers.
I’m confident that I could look through RAZ and find an error. If I thought it’d actually be useful, I’d do that. I did recently find two errors in a different book favored by the LW and EA communities (and I wasn’t actually looking for errors, so I expect there are many others – actually there were some other errors I noticed, but those were more debatable). The first error I found was a misquote. I consider it basically inexcusable. It’s from a blog post, so it would have been copy/pasted, not typed in, so why would there be any changes? That’s a clear-cut error which is really hard to deny is an error. I found a second, related error which is worse but requires more skill and judgment to evaluate. The book has a bunch of statements summarizing some events and issues. The misquote is about that stuff. And, setting aside the misquote, the summary is wrong too. It gives an inaccurate portrayal of what happened. It’s biased. The misquote error is minor in some sense: it’s not particularly misleading. The biased summary of events, however, is significantly wrong and misleading.
I can imagine writing two different posts about it. One tries to point out how the summary is misleading in a point-by-point way breaking it down into small, simple points that are hard to deny. This post would use quotes from the book, quotes from the source material, and point out specific discrepancies. I think people would find this dry and pedantic, and not care much.
In my other hypothetical post, I would emphasize how wrong and misleading what the book says is. I’d focus more on the error being important. I’d make less clear-cut claims so I’d be met with more denials.
So I don’t see what would actually work well.
That’s why I haven’t posted about the book’s problems previously and haven’t named the guilty book here. RAZ is not the book I found these errors in. I used a different example on purpose (and, on the whole, I like RAZ, so it’s easier for me to avoid a conflict with people who like it). I don’t want to name the book without a good plan for how to make my complaints/criticisms productive, because attacking something that people like, without an achievable, productive purpose, will just pointlessly alienate people.