The spectacular collapse of the cryptocurrency trading firm FTX has raised a number of pressing questions. Why did its founder, Sam Bankman-Fried, receive such fawning media coverage? Will his customers get their crypto back? Oh, and should wealthy philanthropists in the United States spend their money on buildings for their alma mater, bed nets halfway around the world, or preventing global disasters in the distant future?
This last question is relevant because Bankman-Fried was one of the biggest financial backers and media promoters of effective altruism, in his own words "a field of research and a community of practice that aims to find the best ways to help others and to put them into practice." EA is all about studying various charities, determining which ones do the most good, and donating money to them. It has also become an influential subculture in the Bay Area, where adherents generally refer to themselves as effective altruists in much the same way they might describe themselves as leftists or psychonauts.
The fact that the public face of EA was the leader of a clique of Millennial supernerds apparently running a multibillion-dollar Ponzi scheme from a penthouse in the Bahamas has naturally tainted the movement. A number of charities now stand to lose hundreds of millions of dollars in expected donations. Some donors wonder whether to be involved with EA at all. "Effective altruism posits that making money by (almost) any means necessary is okay because you…are so brilliant that you absolutely should have all the power involved with billions of dollars in the bank," the CoinDesk columnist David Z. Morris argued, a sentiment repeated time and time again online.
Yet this crisis also creates an opportunity. Effective altruism, the movement, is not the same as effective altruism, the practice of financially maximalist, rigorously data-driven philanthropy. Separating the latter from the former would benefit everyone on the planet.
EA began coalescing in the 2000s, when the Oxford philosophers Toby Ord and William MacAskill, along with Bernadette Young, founded Giving What We Can, a group whose members pledge to "donate a significant portion of their income to effective charities." A number of other think tanks and research centers followed, along with an active online community. The movement's main intellectual wellspring was the work of the utilitarian philosopher Peter Singer. We can help others. We should help others. We must help others, his philosophy holds. How should we do it? As much as we can, as efficiently as possible. "It combines both the head and the heart," Singer said of EA in a much-watched 2013 TED Talk.
EA encourages everyone who can to donate as much of their wealth as possible, whether 10 percent of their income or anything over a certain amount. More controversially, it suggests that people "earn to give" by working, say, on Wall Street and donating the money, rather than taking a socially conscious but lower-paying job. MacAskill himself encouraged Bankman-Fried to make millions, which led him into high-frequency trading and then into crypto. (The irony is rich: a movement dedicated to fighting poverty encouraged its adherents to become as wealthy as possible.)
EA maintains that all people are equal; thus, donors should not prioritize people who share their interests, background, or nationality. Concretely, the movement tries to figure out how to do the most good for the greatest number, then advises donors on where to send their money.
This focus on results is a very good thing, and nonprofits such as GiveWell and Open Philanthropy have helped make big-budget philanthropy more accountable, transparent, and effective. Many charities spend huge amounts on overhead and do little measurable good. In some cases, nonprofit organizations harm the communities they intend to help; research shows that donated American clothing, for example, undermines the textile trades of sub-Saharan African countries and overloads their landfills.
And a lot of charitable giving is about the pride of the giver rather than the needs of the recipient: putting your name on an Ivy League college gym rather than helping children with diarrheal disease on another continent. Donating a million dollars to, say, a youth sports league in the country where your grandparents grew up might seem like a good thing to do. Surely it would be better than buying a yacht for yourself. But those dollars would do more good if distributed as cash grants to refugees or spent on antimalarial bed nets. The EA movement has gotten a lot of people to see this logic, and thus to commit an estimated $46 billion to unsexy but important initiatives.
Yet the movement is insular. Its demographics skew very young, very male, very white, very educated, and very socioeconomically privileged. Many EAs come from a tech background; many also see themselves as "rationalists," interested in applying Bayesian reasoning to every possible situation. EA has a culture, and that culture is corny, earnest, and moralizing. It is also, at least in my many interactions with EAs, overly intellectual, performative, even onanistic.
It’s perhaps unsurprising that EA’s focus has drifted in recent years from poverty toward more esoteric concerns. The movement became captivated by something called "longtermism," which amounts to giving priority to the distant, very distant future. Letting thousands of children die today from preventable, poverty-related causes is terrible, of course, but wouldn’t it be worse if billions of people never lived at all because of the ravages of an as-yet-uninvented weapon? Yes, according to a certain kind of utilitarian logic. And the money followed this logic: Bankman-Fried himself put $160 million into a fund to tackle, among other things, the dangers of synthetic biology, the promise of space governance, and the damage artificial intelligence could inflict on humanity many years from now.
Longtermism is right about one thing: we underrate the future. The world would be a better place today if philanthropists had invested heavily in pandemic preparedness or the prevention of global warming 30 years ago. But much of EA’s thinking about the far future is fantastical. Some longtermists, for example, argue that we need to strike a balance between tackling climate change now and investing in the colonization of space; they encourage us to think on the scale of a billion years.
The FTX debacle clearly demonstrates the problem with this kind of mentality, as the economist Tyler Cowen has noted. No one at Bankman-Fried’s multimillion-dollar philanthropic fund seemed to realize that the risk emanating from the Bahamas today was more pressing than anything space lasers could do tomorrow. "Hardly anyone associated with Future Fund saw the existential risk to…Future Fund, even though they were as close to it as one could possibly be," Cowen writes, so "I am skeptical about their ability to predict existential risk more generally, and for systems that are far more complex and also far more distant." Indeed, EA seems to have ended up committing the very sin it was meant to correct in traditional philanthropy: it got lost in the vainglory of its unaccountable donors and lost sight of the real problems of the real world.
The task of making altruism effective is too important to be left to effective altruists, and they have no particular claim to it. EAs did not come up with the idea of seeking the best value in giving money to the world’s poor, after all. Indeed, EA borrows its techniques from the "randomista" movement in development economics, which subjects policy interventions to randomized controlled trials, and from proponents of simple cash transfers as a solution to poverty in the Global South. The whole thing is, in part, a rebranding exercise.
The fall of Bankman-Fried should trigger another rebranding, and a sorting of good EA from bad EA. Encouraging people to donate their money? Great. Becoming a billionaire in order to give your money away? A terrible idea. Making some of the world’s richest white people care about the world’s poor? Fantastic. Convincing those same people that they know best how to take care of all of humanity? Lord help us. An insular clique of tech self-promoters should not be the public face of charitable accountability and effectiveness. Don’t let them.