Well-Kept Gardens Die By Pacifism by Eliezer Yudkowsky:
Good online communities die primarily by refusing to defend themselves.
Somewhere in the vastness of the Internet, it is happening even now. It was once a well-kept garden of intelligent discussion, where knowledgeable and interested folk came, attracted by the high quality of speech they saw ongoing. But into this garden comes a fool, and the level of discussion drops a little—or more than a little, if the fool is very prolific in their posting. (It is worse if the fool is just articulate enough that the former inhabitants of the garden feel obliged to respond, and correct misapprehensions—for then the fool dominates conversations.)
And what if you’re wrong about who the fools are?
What if you’re biased?
Where are the error correction mechanisms? Where are the objective tests to check that you aren’t screwing up?
Where’s the transparency? The rule of law, not rule of man? Where are the checks and balances?
If all you’ve got (as the moderator) is “I trust my judgment” then you’re just like everyone else, including the fool.
If you add some methods to try to have good judgment and try not to be biased … that’s not enough. Trying really hard, but ultimately trusting yourself and just going by what makes sense to you … is not a good enough defense against bias. Bias is powerful enough to beat that.
If fundamentally you’re just assuming your biases are pretty small and manageable, you are a fool. We are all alike in our infinite ignorance. We’re at the beginning of infinity. We’re so lost and confused in so many ways. We have big errors. Really big errors. Some are society-wide. And when we suppress everything that seems really quite wrong to us, what we’re doing is suppressing outliers. Yeah negative outliers are more common than positive outliers. It’s easier to be wrong than right. But how do you avoid suppressing positive outliers?
There are mechanisms you can put in place to make it harder to act on whim, bias, irrationality, etc.
E.g. you can explain why you take moderator actions. And you can field questions about your actions and reply to criticism. You could reply to all such things, plus followups. If you won’t do that, that’s a cutoff where you’re blocking error correction. And what have you done to prevent this cutoff from protecting the biases you may have?
Yes, defense mechanisms are needed. But why can't they be efficient and reusable ways to address the arguments of everyone, including fools? And why not a resilient forum where people think for themselves about what to focus attention on? If people want curation, fine, no problem, post curated selections somewhere – and leave the open forum also in existence. Post weekly favorites or whatever for the readers who don't want to find the good stuff themselves. The curated view on the forum doesn't have to be the only view. You can have an open public space and a walled garden, both. Or dozens of competing walled gardens with different curators (though only a few would probably have the vast majority of the popularity). But that's dangerous. The curators may be biased. They may curate mostly by social status, for example. They may not know they do that. They may not understand some of their own systematic biases.
You have to fundamentally stop assuming you’re right or probably right and take seriously that you need error correction mechanisms to keep yourself honest. You can’t trust your own integrity. Don’t bet your life or your forum on your own integrity.
Scientists don’t bet science on their own integrity. Integrity helps. Try to have it. But science isn’t like “ok well do your best with the experiment and if you have integrity it should work out ok”. Instead, experiments are designed to work out ok even if the experimenters can’t be trusted. The mechanisms don’t have unlimited resilience. Egregious scientific fraud has to get caught by outsiders. The scientific method makes that easier to catch. It’s harder to fake your experiments when there are procedures to follow, documentation to provide, etc. Most people who cheat do it without full conscious intention, though, which is easier to deal with. Having a little bit of integrity is really helpful compared to none. And anti-bias mechanisms with transparency and stuff do put a leash on the bad faith cheaters.
Double blinding makes it harder to cheat even if you want to. You can be really biased, and your bias can control you, but if you follow the rules of double blinding then it’s much harder for you to bias the results.
Control groups don’t care if you’re biased. That’s a system which is hard to cheat without extreme blatantness like skipping the control group entirely and using fabricated data. And it’s hard to do that because of transparency. And even if you get away with it, your results won’t replicate.
You’re expected to write about sources of error. If you don’t, people will write them for you and be more wary of your claims. If you do, people will consider how severe the issues are. If you write them but try to bias them, you’ll have a harder time convincing people who don’t share your biases. And when you leave stuff out, people can notice and then it’s harder for you to answer critics by claiming you totally knew about that and took it into account already.
Even when everyone shares biases, methods like “make a hypothesis. write it down. plan an experiment to test it. write down what results will agree with or disagree with the hypothesis. test it. compare the results to the predictions.” are capable of correcting everyone. That kind of method makes it way harder to fool yourself. If you skip steps like writing things down as you go along, then it’s much easier to fool yourself.
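To make the “write it down first” step concrete, here’s a minimal sketch in Python. All the names in it (the prereg.json file, the hypothesis text, the outcome strings) are made up for illustration; the point is only that the prediction exists on disk before the result does, so it can’t be quietly revised to fit the outcome.

```python
# Minimal sketch of the "write it down in advance" discipline.
# All names (prereg.json, the hypothesis, the outcome strings) are illustrative.
import json
from datetime import datetime, timezone

def preregister(path, hypothesis, predicted_outcome):
    """Record the hypothesis and predicted outcome BEFORE running the test."""
    record = {
        "hypothesis": hypothesis,
        "predicted_outcome": predicted_outcome,
        "registered_at": datetime.now(timezone.utc).isoformat(),
    }
    with open(path, "w") as f:
        json.dump(record, f, indent=2)

def evaluate(path, actual_outcome):
    """Compare the actual result against the prediction written down earlier."""
    with open(path) as f:
        record = json.load(f)
    return {
        "predicted": record["predicted_outcome"],
        "actual": actual_outcome,
        "agrees": record["predicted_outcome"] == actual_outcome,
    }

# The prediction is registered first; the result is scored against it later.
preregister("prereg.json", "fertilizer X raises yield", "treatment > control")
print(evaluate("prereg.json", "treatment > control"))
```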
Forums need the same thing. Write down rules in advance. Make the rules totally predictable so people can know in advance what violates them or not. Don’t rely on potentially-biased judgment.
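One way to picture “rules written in advance, applied predictably” is a sketch like the one below (Python, with hypothetical rule names and post fields). Each moderation decision traces back to a specific published rule that anyone can re-run against the post and get the same answer, rather than to a moderator’s in-the-moment judgment.

```python
# Hypothetical, simplified sketch: moderation rules as objective predicates
# published in advance. A post is actionable only if a specific named rule
# matches, and anyone can re-run the same check and reach the same verdict.
RULES = [
    ("no-doxxing", lambda post: "home address:" in post["text"].lower()),
    ("length-limit", lambda post: len(post["text"]) > 20_000),
]

def violations(post):
    """Return the names of every pre-published rule this post violates."""
    return [name for name, violates in RULES if violates(post)]

post = {"author": "example_user", "text": "A long comment about moderation..."}
print(violations(post))  # [] -> no rule matched, so no moderator action
```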
And when you must use judgment at a forum, be like a damn scientist, publish results, talk about sources of error, answer your critics, answer all the questions and doubts, etc. Take that seriously. Discuss the matter to conclusion. At least don’t be and stay wrong if anyone in your community knows you’re wrong and will explain it.
If you can’t resolve discussions, fix that and consider that maybe you’re a bit of a fool. If you don’t know how to get to conclusions with your critics, or how to manage those discussions without just ignoring arguments, you need better systems and shouldn’t be so high and mighty as to proclaim who is a fool.
Ideally, let the fool defend himself, too. Don’t just let others defend him. Topic-limit him during that discussion if you must, so he can’t participate in other discussions until it’s resolved.
Also in the article, EY says academia is a walled garden that keeps the fools out, so that’s why people are naive and don’t realize they need censors. And I’m like: Yeah, that is exactly what academia is and it’s fucking awful there. And academia’s walls are 99% based on social status.
What is your forum doing to prevent social status bias from deciding who the fools are? What explicit policies does it have that could actually work in case you, the moderators, are biased by social status?
EY’s answer is basically “if the mods suck, the forum is fucked”. Just find other, better people to rule. What an awful answer. Seriously, that’s his position:
Any community that really needs to question its moderators, that really seriously has abusive moderators, is probably not worth saving.
No! You need good systems, not sinless, unbiased moderators (nor mediocre moderators who aren’t all that bad and you just put up with their errors). It’s like: dear God, we’re not going to get unbiased politicians; we need a government system that works anyway. Forums are the same thing. Write out laws in advance. Make new laws with a process. Anyone who doesn’t violate a law in a crystal clear way gets away with it. Etc. Otherwise the moderators will have some sort of biases – doesn’t everyone? – and they’re going to oppress the people with other biases who are such a source of intellectual diversity. Broaden your horizons instead of getting rid of all your dissidents and misfits and outliers. God damn you; “overcoming bias,” you say?
We know a lot about how to deal with bias, abuse, unfairness, etc. from our legal system and from science. Yet people don’t apply those lessons when they have power over a forum.
I have seen rationalist communities die because they trusted their moderators too little.
Here—you must trust yourselves.
You don’t overcome your biases by trusting yourself.
Don’t be a skeptic. Don’t never reach any conclusions or judgments. Don’t be scared to act in your life. But don’t trust your moderation to be unbiased. Have layers and layers of safety valves that provide different opportunities for you to be corrected if you’re wrong or biased. Never trust yourself such that you don’t think you need any mechanisms to keep you honest.
The scientific method makes it harder to be wrong and stay wrong. We need stuff like that for running forums too.
The scientific method does not consider it adequate for scientists to read about how to be good scientists, think about it, discuss it, do their best, and trust themselves. That would be ridiculous.
And the laws have lots of protection against the errors and biases of the individuals who enforce them. E.g. police, judge, jury and executioner are all different people. At forums they’re usually all the same people, which means you need even more safety mechanisms of other types. And we write out clear, objective laws in advance – no retroactive crimes. And there are appeals. And defendants have rights and privileges that the judges have to respect, and that really is enforced in various ways. And we try to limit the discretion of judges to be biased or play favorites by making the law really clear about what they should do. We don’t do that perfectly, but we try and it helps. And when we make laws, they are (or at least should be) pretty generic and based on general principles instead of targeting specific individuals or small groups – we want our laws to be relevant for future generations and different societal situations, rather than overly specific to deal with some current event.