
Organized EA Cause Evaluation

I wrote this for the Effective Altruism forum. Link.

Suppose I have a cause I’m passionate about. For example, we’ll use fluoridated water. It’s poison. It lowers IQs. Changing this one thing is easy (just stop purposefully doing it) and has negative cost (it costs money to fluoridate water; stopping saves money) and huge benefits. That gives it a better cost-to-benefit ratio than any of EA’s current causes. I come to EA and suggest that stopping water fluoridation should be the highest priority.

Is there any *organized process by which EA can evaluate these claims, compare them to other causes, and reach a rational conclusion about resource allocation to this cause?* I fear there isn’t.

Do I just try to write some posts rallying people to the cause? And then maybe I’m right but bad at rallying people. Or maybe I’m wrong but good at rallying people. Or maybe I’m right and pretty good at rallying people, but someone else with a somewhat worse cause is somewhat better at rallying. I’m concerned that my ability to rally people to my cause is largely independent of the truth of my cause. Marketing isn’t truth seeking. Having the energy to keep writing about the issue, when I’ve already made my points (which are compelling if true, and which no one has refuted), is different from truth seeking.

Is there any reasonable on-boarding process to guide me to know how to get my cause taken seriously with specific, actionable steps? I don’t think so.

Is there any list of all evaluated causes, their importance, and the reasons? With ways to update the list based on new arguments or information, and ways to add new causes to the list? I don’t think so. How can I even know how important my cause is compared to others? There’s no reasonable, guided process that EA offers to let me figure that out.

Comparing causes often depends on some controversial ideas, so a good list would take that into account and give alternative cause evaluations based on different premises, or at least clearly specify the controversial premises it uses. Ways those premises can be productively debated are also important.

Note: I’m primarily interested in processes which are available to anyone (you don’t have to be famous or popular first, or have certain credentials given to you by a high-status authority) and which can be done in one’s free time without having to get an EA-related job. (Let’s suppose I have 20 hours a week available to volunteer for working on this stuff, but I don’t want to change careers. I think that should be good enough.) Being popular, having credentials, or working at a specific job are all separate issues from being correct.

Also, based on a forum search, stopping water fluoridation has never been proposed as an EA cause, so hopefully it’s a fairly neutral example. But this appears to indicate a failure to do a broad, organized survey of possible causes before spending millions of dollars on some current causes, which seems bad. (It could also be related to the lack of any good way to search EA-related information that isn’t on the forum.)

Do others think these meta issues about EA’s organization (or lack thereof) are important? If not, why? Isn’t it risky and inefficient to lack well-designed processes for doing commonly-needed, important tasks? If you just have a bunch of people doing things their own way, and then a bunch of other people reaching their own evaluations of the subset of information they looked at, that is going to result in a social hierarchy determining outcomes.

Elliot Temple on November 28, 2022


Want to discuss this? Join my forum.

(Due to multi-year, sustained harassment from David Deutsch and his fans, commenting here requires an account. Accounts are not publicly available. Discussion info.)