The main reasons I tried to talk with EA are:
- they have a discussion forum
- they are explicitly interested in rationality
- it's public
- it's not tiny
- they have a bunch of ideas written down
That's not much, but even that much is rare. Some groups just have a contact form or email address, not any public discussion place. Of the groups with some sort of public discussion, most now use social media (e.g. a Facebook group) or a chatroom rather than a forum, so there's no reasonable way to talk with them. My policy, based on both reasoning and past experience, is that social media and chatrooms are so bad that I shouldn't try to use them for serious discussions. They have the wrong design, incentives, expectations and culture for truth seeking. In other words, social media has been designed and optimized to appeal to irrational people. Irrational people are far more numerous, so the winners of a huge popularity contest have to appeal to them. Forums existed many years before social media, but are now far less popular because they're more rational.
I decided I was wrong about EA having a discussion forum. It's actually a hybrid between a subreddit and a forum. It's worse than a forum but better than a subreddit.
How good of a forum it is doesn’t really matter, because it’s now unusable due to a new rule saying that basically you must give up property rights for anything you post there. That is a very atypical forum rule; they're the ones being weird here, not me. One of the root causes of this error is their lack of understanding of and respect for property rights. Another cause is their lack of Paths Forward, debate policies, etc., which prevents error correction.
The difficulty of correcting their errors in general was the main hard part about talking with them. They aren't open to debate or criticism. They say that they are, and they are open to some types of criticism which don't question their premises too much. They'll sometimes debate criticisms about local optima they care about, but they don't like being told that they're focusing on local optima and should change their approach. Like most people, each of them tends to want to talk only about stuff he knows about, and they don't know much about their philosophical premises and have no reasonable way to deal with that (there are ways to delegate and specialize so you don't personally have to know everything, but they aren't doing that and don't seem to want to).
When I claim someone is focusing on local optima, it moves the discussion away from the topics they like thinking and talking about, and have experience and knowledge about. It moves the topic away from their current stuff (which I said is a local optimum) to other stuff (the bigger picture, global optima, alternatives to what they're doing, comparisons between their thing and other things).
Multiple EA people openly, directly and clearly admitted to being bad at abstract or conceptual thinking. They seemed to think that was OK. They brought it up in order to ask me to change and stop trying to explain concepts. They didn't mean to admit weakness in themselves. Most (all?) rationality-oriented communities I have past experience with were more into abstract, clever or conceptual reasoning than EAers are. I could deal with issues like this if people wanted to have extended, friendly conversations and make an effort to learn. I don't mind. But by and large they don't want to discuss at length. The primary response I got was not debate or criticism but being ignored or downvoted. They didn't engage much. It's very hard to make progress with people who don't want to engage because they aren't very active-minded or open-minded, because they're too tribalist and biased against some types of critics/heretics, or because they have infallibilist, arrogant, over-confident attitudes.
They often claim to be busy with their causes, but it doesn’t make sense to ignore arguments that you might be pursuing the wrong causes in order to keep pursuing those possibly-wrong causes; that’s very risky! But, in my experience, people (in general, not just at EA) are very resistant to caring about that sort of risk. People are bad at fallibilism.
I think a lot of EAers got a vibe from me that I'm not one of them – that I'm culturally different and don't fit in. Because they saw me as an enemy rather than someone on their side/team/tribe, they treated me like I wasn't actually trying to help. Their goal was to stop me from achieving my goals rather than to facilitate my work. Many people weren't charitable and didn't see my criticisms as good-faith attempts to make things better. They thought I was in conflict with them instead of someone they could cooperate with, which is related to their general ignorance of social and economic harmony, win/wins, mutual benefit, capitalism, classical liberalism, and the criticisms of conflicts of interest and group conflicts. (Their basic idea with altruism is to ask people to make sacrifices to benefit others, not to help others using mutually beneficial win/wins.)