Thanks, I thought this was a reasonable critique of EA!
That said, I think it's wrong to say that EA ignores air pollution in India or economic growth. Open Philanthropy (one of the only large foundations which acts on EA principles) is one of the largest donors to reducing air pollution in India (https://www.openphilanthropy.org/research/incoming-program-officers-for-south-asian-air-quality-and-global-aid-policy/).
The Program Officer for that area gave a talk on South Asian Air Quality at a recent EA conference in India, where he explicitly mentioned that it's a difficult area to work in because it's hard to measure impact - but that we should still do it because the high-level case is so strong!
At that same event, there was also a talk on the importance of economic growth in India.
Happy to share links to those talks when they come online.
Santosh Harish's talk on South Asian Air Quality is now live :)
https://www.youtube.com/watch?v=wbNGui-49CE
> "But the “ineffectiveness” of sponsoring guide dogs to help blind Americans or donating to keep research libraries stocked with obscure titles isn’t a bug. It’s a feature. The diverse enthusiasms of generous people make for a richer cultural environment."
One worry here is that ordinary preferences aren't all that diverse. Most charitable giving in the US goes to churches. Far more goes to cute and cuddly animals than to less-cute ones (including more intelligent animals like pigs, which are routinely tortured on factory farms). Those with funds to donate are more likely to have personal/emotional connections to "first world problems" than to victims of malaria or intestinal parasites.
Which is all just to say that if a diverse giving portfolio is important, then we probably need to *explicitly* target that -- searching for "neglected" opportunities, just as many EAs recommend -- and not just defer to unreflective preferences.
When I say that EA does good on the margin, this is the kind of thing I have in mind. But, given its audience, it's more likely to reduce funding for higher education or the arts than for churches or puppy rescue. People who make it big on Wall Street or in Silicon Valley aren't the ones putting money in the collection plate on Sundays or responding to direct-mail solicitations with cute animals. I think it's fair to say that EA would seek to stigmatize endowing programs at wealthy universities (state HBCUs might be OK) or funding ballet companies (especially if they hold galas) rather than funding malaria prevention.
I think you are attacking a weakman here. "Effective Altruism suffers from the blind spots that are characteristic of highly intelligent, self-described rationalists: hubris and a fixation on counting things. It assumes that it’s easy to tell what will do good and that the only way of “doing good” is directly extending life expectancy."
Firstly, there is a whole "longtermist" EA crowd. Are they pushing to accelerate economic growth? By and large, they aren't. Why? Because on million-year timescales, a few years faster or slower doesn't make much difference compared to the risk of extinction.
A defense of the "fixation on counting things" would be that the world is complicated. There are a huge number of possible interventions, and I am not capable of measuring the effectiveness of most of them. Given a million interventions, 1000 of which you can accurately measure and the rest you have no clue about, your best strategy is to pick the best one out of the 1000. Will it be the best strategy overall? Probably not. But you don't have a good way to find a better one. From an organizational-design point of view, organizations are worse at pursuing hard-to-measure goals. Organizations without clear signals for how well they are doing are more at risk of degenerating without it being noticed. Fixing the easily measured problems might be the best you can do in practice.
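The "pick the best of the measurable 1000" argument can be illustrated with a quick simulation. This is only an illustrative sketch, and it assumes one thing the comment doesn't state: that the measurable subset is a random draw from all interventions. Under that assumption, the percentile rank of the best measurable intervention doesn't depend on how impact is distributed, so each intervention's rank can be modeled as a uniform draw:

```python
import random

random.seed(0)
N_MEASURABLE = 1_000   # interventions whose impact we can actually measure
TRIALS = 10_000        # repeat the thought experiment many times

# For each trial, draw a random percentile rank for each measurable
# intervention and keep the best one found in that subset.
best_ranks = [
    max(random.random() for _ in range(N_MEASURABLE))
    for _ in range(TRIALS)
]
avg = sum(best_ranks) / TRIALS
print(f"best measurable intervention beats ~{avg:.1%} of all interventions")
```

The expected maximum of 1000 uniform draws is 1000/1001, so the best measurable option typically outranks about 99.9% of all interventions: almost certainly not the global best, but far better than choosing blindly among the unmeasurable ones.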
An alien supergenius altruist would go around taking all sorts of inscrutable actions that improved the future in convoluted ways. But we aren't alien supergeniuses. So we must confine ourselves to actions with consequences so obvious that even a human can work out what they are.
Actually, I don't buy the "tourist attraction" argument. Which other city would those tourists have visited otherwise?
And it seems unlikely that "selfish pursuit of luxuries" is the best way to make the world better off in the long term.
Excellent essay. The act of giving, a simple thing, seeking to be complex, rational, effective. Perhaps the concern for “ineffectiveness” is more an ego trip taken by those who need an intellectual reward in order to feel okay about setting their money free, the need to feel in control being their primary motivator and guide.
Excellent/fair critique.
Re "you don’t get economic growth from a philosophy that tells people they are morally culpable for countless deaths if they consume anything more than absolutely necessary":
I don't personally find EA compelling, but in fairness, one of their stated reasons for setting 10% of income as a goal for donations is to prevent the overscrupulous among them from going off that particular deep end, by telling them there's such a thing as "enough" long before they sell their second to last shirt to buy bednets.
Depends on what you mean by "they," a tricky question when dealing with a diverse movement. Peter Singer considers 10% a good start, not an end goal.
Fair enough. I get most of what I know about EA via Scott Alexander at Astral Codex Ten, so I may be getting a skewed view.
(Plus various news bits like the Center for Effective Altruism buying a castle for EA conferences, which suggests that human nature may well be able to wrap EA around to justify things other than bednets and AI alignment research for better or worse.)
All that intellectualizing by EA just to reinvent the concept of tithing!
A fine analysis of a complex problem of very long standing.
Harvey Silverglate, Cambridge, MA