
If your point is that specific, well-researched criticism is harder than general, paradigmatic criticism, I agree. I think that's why you tend to see more of the latter, though much of it is low-quality.

If your point is that paradigmatic criticism (or this specific paradigmatic criticism) is without value, I strongly and specifically disagree.

I admittedly haven't read any of the other entries, but I would be happy to see Zvi win (at least some of the prize pool of) this contest. I briefly considered entering this contest, but was put off for the same reasons he expresses in his post.

To distill what he's trying to say: imagine if the Catholic Church held an essay-writing contest asking entrants to point out the Church's sins, but then, in the fine print, strongly implied that sins would be judged according to the teachings of Jesus Christ, and by a select group of Cardinals. That would drive away anyone trying to point out cases where the Church's interpretations of Jesus's teachings might be wrong, or where the teachings of Jesus don't work on a fundamental level.

This is the same deal. The contest asks for criticism, but then implies that entries will be judged within EA's interpretation of utilitarianism, thus pushing away any potential criticism of the fundamentals.

Could most places stand to be a bit more utilitarian? Sure! Most places could also stand to follow the teachings of Jesus a bit more closely. Those are both in the general vicinity of "good" in my book, if bounded by general common sense.

But both of them have problems, or at least diverge, if you take them to the extreme. You know this and wrote about it in a post of yours, which I think about a lot. [1] That's when you start getting things like Zvi describes, of non-vegans being treated "as non-serious (or even evil)".

Another red flag is EA treating "community building" as a core cause area. You can easily torture utilitarianism into justifying it: sure, you could research malaria cures yourself, or you could talk to ten undergrads, convince them to go into malaria research, and get ten times the probability of success!

Meanwhile, everyone starts thinking you're a cult. [2] And they're not… totally wrong? EA isn't a cult yet, but it is arguably becoming a religion, even more so than in the sense that "every social movement is a religion". It's built on a core moral foundation (utilitarianism), distributes holy books for free, [3][4] and has convinced itself that missionary work is of the utmost importance. (Seriously, please read [2].)

And what tends to happen to religions? They tend to start believing in their own importance a bit too much, sometimes at the expense of actual social good. They're at risk of being captured by people that are more interested in improving their social status than actually making the world a better place. They have a tendency towards purity spirals that take their morality farther and farther into Extremistan.

If (as Zvi suggests) we're at the point where people who might be able to work in AI safety or research a cure for malaria or whatever are being treated poorly because they eat chicken, then that's a red flag that EA is starting to fall into these traps.

When you attack a religion, you've got to attack its roots. Not "This person isn't following Jesus properly," but "There is no God and it's absurd to think that there is."

The problem is, this usually results in the destruction of the movement.

How can EA survive this? I think if it took a diminished view of its own importance, you could still salvage a lot from it.

Instead of convincing a large number of people to be good little EAs/utilitarians, have only a small number of core utilitarians bringing up potential cause areas for broader consideration. This is what GiveWell does, and it works pretty well. Their top recommended charities are hard to argue with, even if you don't buy into the overall utilitarian bent behind their work.

Instead of recruiting undergrads to EA as a whole, try to recruit them to work in specific cause areas that are understaffed and that they seem well-suited for.

To borrow from a different religion's sacred texts: the goal is to cut your enemy. The goal of EA should be to move the needle on these cause areas, not to move the needle on the acceptance of EA or utilitarianism more broadly.

…I guess I sort of ended up writing that contest entry in this comment.

[1]: https://slatestarcodex.com/2018/09/25/the-tails-coming-apart-as-metaphor-for-life/

[2]: https://forum.effectivealtruism.org/posts/xomFCNXwNBeXtLq53/bad-omens-in-current-community-building

[3]: https://80000hours.org/the-precipice/

[4]: https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-recommendations


> And what tends to happen to religions? They tend to start believing in their own importance a bit too much, sometimes at the expense of actual social good. They're at risk of being captured by people that are more interested in improving their social status than actually making the world a better place. They have a tendency towards purity spirals that take their morality farther and farther into Extremistan.

Which religious groups (in the narrow sense of 'religion') are more morally extreme now than they were in their first few generations?

Jul 20, 2022·edited Jul 20, 2022

Now is the wrong time frame because of religion's general loss of status and power. Instead, ask: "which religious groups ever became more morally extreme than they were in their first few generations?"

And the answer is clearly at least Christianity (the inquisition) and Islam (holy wars), and arguably Buddhism (as practiced in Japan in various later eras).


I would point out that the form of Buddhism you're citing (True Pure Land Buddhism) is in fact LESS morally extreme (recitation of nembutsu is seen as the only attainable virtue, and the recitation of nembutsu 10,000 times is all that is required for salvation) and also engages in so many extreme doctrinal variations that it's arguably a heretical form of Buddhism.

...I say, as a follower of a Vajrayana tradition, which is often seen in the same light.


I wasn't citing True Pure Land specifically (as I'm only dimly aware of it as a distinct sect, and just have a general understanding that there were many sects at this time), but I do agree that various flavors of Japanese Buddhism seem to be seen as heretical outside Japan.


My apologies, in my experience it and Zen are the only forms of Japanese Buddhism people know.

I'll also comment that Zen is one of the few Japanese Buddhist sects that has wider acceptance outside of Japan because it strongly derives from Chan, a "legitimate" fundamentalist sect within the Mahayana tradition, which is the predominant one. Most other sects (Shingon, Nichiren, Pure Land) are seen as "heretical" because they have a strong influence from Vajrayana or "Esoteric Buddhism", which rejects some pretty fundamental ideas in the other schools like non-violence and celibacy being absolute requirements to attaining Nirvana (unsurprisingly, the places where Vajrayana caught on had strong warrior aristocracies who liked the idea of killing unrepentant bandits and wars of conquest by a "humane king" as part of the dharma).

True Pure Land is rather infamous because they took the idea of "Buddhist teachings being spread by the sword" to its logical endpoint and tried to establish a Buddhist theocracy, as well as rejecting all moral law beyond nembutsu and adherence to the teachings of the priesthood due to their doctrine that it was impossible to attain complete salvation in the current age.


> That's when you start getting things like Zvi describes, of non-vegans being treated "as non-serious (or even evil)".

I think this happens anywhere you get a critical mass of vegans.


Thank you, this is great, and I'm saving it.

Another red flag (for me at least) was EA causing Scott to publicly wonder whether he should be an EA worker rather than a doctor. Pretty much everyone agrees on the basic positive value of a well-meaning doctor, and not hewing to this standard is a sign of a purity spiral. It's certainly the reason I never pointed my (doctor, do-gooder) brother at EA.

founding

Doctors are basically worthless on a global or historical scale. This is just an indisputable statistical truth, not a sign of a purity spiral. The impact of the average doctor is just never gonna be that big, even in the best-case scenario where they diligently save 1-10 lives every day.

Being a doctor is, like you say, basic. The point of EA is to ask if you can do better than basic.

Jul 20, 2022·edited Jul 20, 2022

The sign of a purity spiral isn't the argument you've made, or asking undifferentiated people to consider it. The sign of a purity spiral is asking a newly minted doctor to switch.

It's not very important to the grand movement that this particular person come on board, but it does show a careless disregard for switching costs that's well associated with destructive purity spirals of the past. It's very French Republican Calendar.

"Disruptive" is a swear word in some circles, and I'm not just talking about old fashioned taxi folks. I'm talking about rationalist west coast academic tech folks.

founding

Sorry, that's ridiculous. You can't ask "undifferentiated people" anything! You're always asking specific people, because only real specific people read blogposts or essays. There's no such thing as "undifferentiated people".

So it's absurd to say a community is purity spiraling for "asking" a doctor to change careers just because he's read posts that make him consider it.

Jul 20, 2022·edited Jul 20, 2022

I thought this was clear from context, but I was wrong, so let me clarify: undifferentiated in the relative sense that they haven't invested 10 years of specialized effort.

Scott didn't read posts that made him consider changing. He wrote about visiting and speaking to EAs who told him in person that it would be better if he switched to being an EA.

Edit: It's fine to ask an 18 year old to be an EA because you think EA is best. It's blithe and callous to *tell* a useful specialist that they should do your thing instead. More than blithe, actually. Downright street preachery.

author

I didn't feel pressured by them. I think I wrote about how I met an EA career counselor who had previously been a doctor but switched to his current job once he realized it was higher impact, and that made me think about the topic. No preaching involved.

I also think that preaching would have been a... social faux pas... but that it's important to distinguish "social faux pas" from "objectively incorrect". If you are an atheist, it's a social faux pas to walk into a church and start telling them they're all wrong and dumb and they need to believe in evolution, but this isn't a strike against the truth of atheism, just against the social mores of that particular atheist.

Jul 21, 2022·edited Jul 21, 2022

Thank you, I misinterpreted and I'll update on their pressure level!

"Social faux pas" and "objectively incorrect" are different but they definitely do have a relationship when you're interacting with new people. There is typically a threshold of social faux pas where you'll likely quietly adjust your best guess of the chance that the speaker is offering you objectively correct information. I will keep using the example of street preachers, but to help illustrate the idea, this group also includes untrained people with heterodox science ideas who are certain enough to corner you and push those ideas without knowing or caring that that's a faux pas.

It's not the same as object-level debate about the merits of EA (or the merits of debating the merits of EA), but I think it's worth pointing out as a side note that a lot of the dispute between the extremes of feeling about EA comes down to some people's threshold being tripped and some not. This makes it particularly helpful to hear that the EA career counselor wasn't using pressure. I have my own interactions with EA, but at least I can wind back "writer I follow was pressured in what I consider an inappropriate way", which makes them seem less like the kind of seriously insistent group that is dangerous to hear out. Thanks.


Did he bring hard-to-replace skills to careers counselling? What about the wasted cost of his medical training?


Have you considered whether the average doctor will likely do more good than the average EA?

I think the way many people "rationally" approach "world-saving" is to estimate, say, a 5% chance of doing enormous good globally, with an impact so large that they can't justify wasting time on local issues. But that estimate is pure air, and if the real probability is 0.001%, they've essentially done nothing.
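To make the arithmetic of this objection explicit, here's a minimal sketch. All probabilities and payoffs are invented for illustration (nobody in this thread proposed these figures): the "world-saver" bet only beats steady, near-certain work if the success probability really is in the few-percent range.

```python
# Expected-value comparison: steady modest impact vs. a long-shot bet.
# All figures are illustrative placeholders, not real estimates.

def expected_lives_saved(p_success: float, lives_if_success: float) -> float:
    """Expected lives saved = probability of success * payoff on success."""
    return p_success * lives_if_success

# A doctor: a near-certain career saving (say) 100 lives over its course.
doctor = expected_lives_saved(p_success=0.95, lives_if_success=100)

# A "world-saver" betting on a global breakthrough worth a million lives,
# first at their own 5% estimate, then at a deflated 0.001%.
optimist = expected_lives_saved(p_success=0.05, lives_if_success=1_000_000)
pessimist = expected_lives_saved(p_success=0.00001, lives_if_success=1_000_000)

print(f"doctor:      {doctor:,.0f}")     # 95
print(f"5% guess:    {optimist:,.0f}")   # 50,000 -- dwarfs the doctor
print(f"0.001% real: {pessimist:,.0f}")  # 10 -- now the doctor wins
```

The whole comparison hinges on a probability estimate spanning several orders of magnitude, which is exactly the "pure air" problem.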


I think it's fine if the Catholic Church only wants criticism within the Christian framework. If you think the teachings of Jesus Christ are bad and wrong, you should not be trying to convince the Catholic Church to abandon them in favor of something better; you should be trying to convince people to abandon the Catholic Church.

On the other hand, even as a non-Catholic you can think, "The Church is an organization that does some good things and some bad things. Some of the bad things are caused by a fundamental difference in our values, but there are some cases where I think they are just making a mistake by their own lights. I will write an essay pointing out those cases."

author

I think it's fair to want advice on how best to do the thing you're doing, rather than to be told you should do a totally different thing.

Although I am not a perfect doctrinaire utilitarian, I'm pretty close and I feel like I have reached a point where I'm no longer interested in discussion about how even the most basic intuitions of utilitarianism are completely wrong - that feels close to something like "morality is dumb and you shouldn't care about it". While this is a philosophically coherent position, it almost feels more like an aesthetic/emotional choice to care about morality, to the point where I would be surprised if a logical argument could talk me out of it. Although obviously if there were a great argument that could talk me out of utilitarianism I would want to hear it, I feel like this is unlikely enough that it would be unfair to promise people a prize for coming up with good arguments in that direction, when I'm so unlikely to think an argument in that direction is good.


Not advocating (i.e., it's perfectly fine to skip), but have you considered satisficing consequentialism?

https://www.princeton.edu/~ppettit/papers/1984/Satisficing%20Consequentialism.pdf

author

I'm not going to read that whole paper now, so sorry if I'm addressing a straw man, but if you asked me "would you rather cure poverty for one million people, or for one million and one people", I am going to say the one million and one people, and I feel like this is true even as numbers get very very high. Although satisficing consequentialism is a useful hack for avoiding some infinity paradoxes, it doesn't really fit how I actually think about ethics. "The ethical thing is to pile exactly five pebbles on top of each other, then stop" is incredibly consistent and paradox-avoid-y, but at some point you have to actually satisfy your moral intuitions.

Jul 24, 2022·edited Jul 24, 2022

No, I think that's a pretty fair man to address, and a pretty fair address. Satisficing consequentialism only makes sense as a system if it fits a sort of natural preference. I will point out that for a lot of people it's very much like https://astralcodexten.substack.com/p/i-will-not-eat-the-bugs .
