Former crypto billionaire Sam Bankman-Fried wasn’t just one of the major figures associated with “effective altruism,” or EA, the philanthropic movement that holds charitable donations to rigorous external measures of effectiveness and impact. He also exerted significant influence over the movement through his once-prodigious funding.
Bankman-Fried’s rapid fall from grace — with the recent collapse of his crypto exchange FTX, which filed for bankruptcy on Friday — casts doubt on the future of the movement, given his advocacy for it and close relationships with its leading intellectuals and leaders. It also raises questions about whether Bankman-Fried’s EA-influenced beliefs about risk and reward played a role in the decisions that led to FTX’s stunning fall.
The Future Fund, a subset of Bankman-Fried’s charitable apparatus that was largely associated with the related movement known as “longtermism” — the idea that the welfare of future people should weigh heavily on the present, and thus, one of the highest impact actions to take today is to minimize risks of extinction or near-extinction — committed “over $160 million” to a wide range of individuals and organizations aligned with the view. The fund’s five external staffers resigned last week.
A lack of cash is the immediate problem. Some projects will go unfunded. Bankman-Fried did not set up an endowment and instead funded projects as they came, telling the New York Times, “It’s more of a pay-as-we-go thing, and the reason for that, frankly, is I’m not liquid enough for it to make sense to do an endowment right now.”
Some organizations are already feeling the shortfall. The nonprofit journalism outlet ProPublica said it won’t receive the full allotment of its Bankman-Fried funding for reporting on public health threats, and Josh Morrison, founder of 1Day Sooner, an organization that advocates for human vaccine challenge trials, told Grid that FTX made up about 12 percent of the group’s funding and was its fourth-largest funder.
But there’s another issue beyond making up funding shortfalls, one the EA community is beginning to confront in a wrenching, often quite public way. (You can read the forum posts yourself.) What responsibility does the EA community and its ideas, especially around longtermism, bear in motivating Bankman-Fried and FTX’s high-risk, destructive and perhaps illegal financial maneuvering? Did the movement’s own lofty goals shift into a messianism that blinded its members to the risks in their midst?
“Hardly anyone associated with Future Fund saw the existential risk to … Future Fund, even though they were as close to it as one could possibly be,” Tyler Cowen, the EA-adjacent George Mason University economist, wrote on his blog. “I am thus skeptical about their ability to predict existential risk more generally, and for systems that are far more complex and also far more distant.”
The greatest risk to the movement turned out to be Bankman-Fried, a man who employed the movement’s leading philosopher and public face, Will MacAskill. MacAskill is credited with convincing Bankman-Fried to do his part for effective altruism by getting rich and donating what he could. Before FTX’s collapse, the 30-year-old Bankman-Fried was estimated to be worth $16 billion.
“I want to make it utterly clear: if those involved deceived others and engaged in fraud (whether illegal or not) that may cost many thousands of people their savings, they entirely abandoned the principles of the effective altruism community,” MacAskill tweeted last week. “Sam and FTX had a lot of goodwill — and some of that goodwill was the result of association with ideas I have spent my career promoting. If that goodwill laundered fraud, I am ashamed.”
Gambling with the future
MacAskill’s close connection to Bankman-Fried has brought attention to his capacious and overlapping roles in the EA community: as a public figurehead following the publication of his book “What We Owe the Future,” and as a leader in a number of EA organizations, including those funded by Bankman-Fried.
“I think people follow individual leaders, so there was some logic in building his brand as a way of creating a public face for effective altruism for people to connect with emotionally and follow,” said Morrison, referring to MacAskill. “For reasons mostly out of his control, he is less able to do that than he was pre-FTX, though I think his association with cryptocurrency was an understandable but unnecessary risk.”
In the wake of FTX’s collapse, much attention has been paid to its sister trading firm, Alameda Research. FTX lent billions of dollars in customer assets to Alameda, run by Bankman-Fried associate Caroline Ellison; when the news emerged, FTX was brought down by a modern-day bank run. While the details of Alameda’s trading and FTX’s relationship with the firm remain to be excavated by law enforcement, auditors, lawyers and reporters, Bankman-Fried had been relatively open about how he dealt with questions of risk and reward, at least in a theoretical way.
“If your goal is to have impact on the world — and in particular if your goal is to maximize the amount of impact that you have on the world — that has pretty strong implications for what you end up doing,” Bankman-Fried told Robert Wiblin, a prominent effective altruist who is the director of research for 80,000 Hours, a nonprofit co-founded by MacAskill that tries to guide young people interested in maximizing human well-being. “If you really are trying to maximize your impact, then at what point do you start hitting decreasing marginal returns?”
Bankman-Fried argued that philanthropists and those earning money to give it away should be more tolerant of risk: “Your strategy is very different if you’re optimizing for making at least a million dollars, versus if you’re optimizing for just the linear amount that you make.” He said he founded FTX because “there’s well, and then there’s better than well — there’s no reason to stop at just doing well.”
Even the very richest have what’s known as “declining marginal utility of wealth”: The first million dollars does more for their well-being than the next million. The first hundred million is better than the next (there are only so many yachts and so many hours to spend on them). But if you’re earning with the goal of giving your money away to solve big problems that bear on the well-being of millions of living people — and billions or trillions of people not yet born (something EAs think a lot about) — maybe things are different.
“More good is more good. It’s not like you did some good, so good doesn’t matter anymore. But how about money? Are you able to donate so much that money doesn’t matter anymore? And the answer is, I don’t exactly know,” Bankman-Fried said to Wiblin.
He continued: “The expected value of how much impact you have, I think, is going to be a function sort of weighted towards upside tail cases. That’s what I think my prior would be. And if your impact is weighted towards upside tail cases, then what’s that probability distribution of impact probably look like? I think the odds are, it has decent weight on zero. Maybe majority weight.”
In other words, the expected benefits from Alameda and FTX amassing capital and making money were so high that it was OK — maybe even ethically mandated — to accept the risk of losing everything.
“If you see yourself as fighting against this risk of humanity and you see humanity as [lasting] the next trillion years, it’s easy to have a god complex and lose humility as a virtue,” Morrison said.
Double or nothing?
In a now-deleted Tumblr blog, Ellison was even more extreme: “If you abstract away the financial details there’s also a question of like, what your utility function is. Is it infinitely good to do double-or-nothing coin flips forever? Well, sort of, because your upside is unbounded and your downside is bounded at your entire net worth. But most people don’t do this, because their utility is more like a function of their log wealth or something and they really don’t want to lose all of their money. (Of course those people are lame and not EAs; this blog endorses double-or-nothing coin flips and high leverage.)”
If the universe of worthwhile projects for others is greater than the amount of money you can make on a bet, Ellison seems to argue, it makes sense to continually go double or nothing. The most you can lose is your net worth, while the most you can gain is the well-being of millions. Of course, in the end, Bankman-Fried and Ellison lost far more than their own net worth; there’s a gaping hole of some billions of dollars where client funds used to be in FTX.
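Ellison’s logic can be made concrete with a quick simulation. The numbers below are hypothetical (a coin that doubles your stake with 60 percent probability, played all-in for 10 flips): the average outcome across many simulated runs grows well beyond the starting stake, yet nearly every individual run ends at zero.

```python
import random

def simulate(n_flips=10, n_trials=100_000, p_win=0.6, seed=0):
    """Repeated all-in double-or-nothing: each flip doubles wealth
    with probability p_win and wipes it out otherwise."""
    rng = random.Random(seed)
    total = 0.0
    busts = 0
    for _ in range(n_trials):
        wealth = 1.0
        for _ in range(n_flips):
            if rng.random() < p_win:
                wealth *= 2.0
            else:
                wealth = 0.0
                break
        total += wealth
        if wealth == 0.0:
            busts += 1
    return total / n_trials, busts / n_trials

mean_wealth, bust_rate = simulate()
# Expected value per flip is 2 * 0.6 = 1.2, so the mean outcome is about
# 1.2**10, or roughly 6x the starting stake, even though about 99.4 percent
# of runs (1 - 0.6**10) end with nothing.
print(f"mean wealth: {mean_wealth:.2f}, bust rate: {bust_rate:.4f}")
```

A bettor maximizing the logarithm of wealth (the utility function Ellison dismisses as “lame”) would never go all-in here; for an even-money bet, the Kelly criterion caps the stake at a fraction 2p - 1 of wealth, or 20 percent per flip in this example.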
“There’s an attempt at maximizing whatever quantity they’re interested in. That is one of the risks inherent in that philosophy, you’re hoping to max out on something. Sometimes, according to that calculation, you’re justified in taking high risk,” Carla Zoe Cremer, an Oxford scholar and EA critic, told Grid.
“The kind of questions they’re asking are about the long-term future, which we have no information for. You can tweak on those variables to the extent you can make any argument for whatever,” Cremer said.
Cremer has written a number of papers and EA Forum posts criticizing the longtermist framework around existential risk, has called for governance reforms within the EA community, and specifically warned of the risk of becoming too dependent on a few charismatic — and rich — individuals. “EA needs to diversify funding sources by breaking up big funding bodies and by reducing each orgs’ reliance on EA funding and tech billionaire funding,” she wrote late last year.
Morrison agreed, telling Grid, “I have believed for a while that EA should be more decentralized. … There was a closeness and reliance on crypto combined with a lack of effective governance. I don’t think this disaster could have been avoided, but it probably could have been mitigated with a more decentralized community.”
Alex Wilson, co-founder of the Giving Block, a company that helps nonprofits and others accept donations in Bitcoin and other cryptocurrencies, downplayed the fallout from the downfall of FTX. He noted that crypto philanthropy, with or without SBF and FTX, continues to grow rapidly.
“I would say their influence on the space was actually relatively small, and I don’t think that it really changes anything when it comes to the broader movement of crypto philanthropy,” he said.
But he is worried that regulators will respond by enacting policies that stifle the industry but don’t solve its problems. “Often when you have these kinds of things happen, overly aggressive things get passed that wouldn’t actually address the underlying issue here — this was a failure of [one] person and entity,” said Wilson.
Filling the gaps
Cremer put Bankman-Fried on one side of the EA community, one that’s more comfortable with outlandish hypotheticals about existential risk, one that’s more willing to “bite the bullet” on what their philosophy implies than others. “Someone like Sam is a bit of an archetype for me,” Cremer said.
While some of EA’s more public-facing figures, like MacAskill, have tried to tamp down this kind of thinking, it still lurks behind any consideration of existential risk. It also may have motivated how Bankman-Fried and other longtermists did philanthropy. While he and his foundations gave to public health organizations and helped fund clinical trials, Bankman-Fried was upfront about what he focused on: longtermist projects.
EA started with the idea that charity could be better optimized to maximize its effectiveness and impact: Donations could save lives, so they should. This led to the work of organizations like GiveWell, founded in 2007, which did calculations on the cost effectiveness of charitable donations. Soon, hundreds of millions of dollars flowed into purchasing bed nets to protect against mosquitoes that transmit malaria, providing children deworming medicine and even giving money directly to very poor people.
But as the movement’s leading intellectuals continued to think about the opportunity to do the greatest good, another goal stood out: preventing human extinction. This would potentially benefit trillions of people whose lives could be snuffed out by nuclear warfare, a devastating pandemic or runaway artificial superintelligence.
One leading EA charitable organization, Open Philanthropy, founded and funded by Facebook co-founder Dustin Moskovitz and his wife Cari Tuna, has essentially split into two groups. One is focused on “Global Health and Wellbeing” — which includes “traditional” EA causes like public health, animal rights and aid to the world’s poor — while the second gives to longtermist causes: biosecurity, pandemics, artificial intelligence and promoting effective altruism.
It’s the latter that ultimately captured the attention of Bankman-Fried and his wallet. (Moskovitz condemned Bankman-Fried in a Twitter thread, describing FTX’s collapse as “infuriating, devastating, and incredibly humbling all at once” while committing to try to “repair the damage Sam did and harden EA against other bad actors … because the stakes remain painfully high.”)
Bankman-Fried’s Future Fund made a dizzying array of grants, but some of the largest went to other EA organizations to support the broader EA community: almost $7 million for coworking spaces in London and Berkeley, almost $14 million for the Centre for Effective Altruism (which MacAskill helped found), $900,000 “to support prizes for outstanding writing which encourages a broader public conversation around effective altruism and longtermism,” and $15 million to Longview and the Long-Term Future Fund, both of which former FTX Foundation president Nick Beckstead advises.
Some in the community expect — or hope — that Open Philanthropy (aka Open Phil) or billionaires like Stripe founder Patrick Collison can fill in the funding gaps. But a world where Open Phil is the undisputed primary EA funder will likely be one that’s more rigorous, staid and procedurally buttoned up than the FTX Foundation, which was run by Beckstead along with FTX and Alameda employees. Open Phil, on the other hand, lists around 70 staffers.
“The funding environment for EA in general over the next couple years isn’t great (FTX aside) insofar as Facebook stock is down, tech stocks seem likely to go down and there may be a recession,” Morrison said. “So there’s probably whiplash from a lot of people making career decisions based on seemingly reasonable financial expectations that have now been upset.”
But many hope the work will continue. “I think there is going to be (and should be) a buckling down on just doing useful s— and talking less about what ‘effective altruism’ means,” said Morrison. “Which I think will be a healthy thing.”
Benjamin Powers contributed to this story. Thanks to Lillian Barkley for copy editing this article.