Elon Musk tweeted a ChatGPT conversation on Saturday that speculated about the 2019 transition of its creator, OpenAI, from a nonprofit to a for-profit organization. The AI chatbot concluded that, if the for-profit business had used the nonprofit’s resources for the change, it would have been “highly unethical and illegal.”
It appears that Musk and ChatGPT didn’t have all the facts. Tax filings seen by TechCrunch indicate the original OpenAI nonprofit retained control over all of its financial assets, totaling tens of millions of dollars, meaning none of its money was used to spin out the organization’s commercial enterprises.
The interesting part is where that money ended up: financing Universal Basic Income pilots aiming to fix the very problems OpenAI’s technologies seem to be creating.
And that’s just one thread in a web of commercial investments and nonprofits that all tie back to Sam Altman, best known as a co-founder of startup accelerator Y Combinator and OpenAI — the nonprofit he started with Musk.
His investments span a dozen industries, from nuclear fusion and supersonic planes to molecular diagnostics and crypto, but key among his wider interests is a collection of nonprofits run by Altman and his close friends.
The story of this family of nonprofits illustrates how a small group of like-minded entrepreneurs can leverage their charitable donations to not only support their personal causes, but to further commercial interests and possibly even accelerate the transformation of society.
A web of nonprofits
It’s far from unusual for tech entrepreneurs to have a charitable foundation or two to distribute their wealth exactly how they wish. But Altman’s commercial and charitable dealings are more intertwined than most.
Altman controls at least two nonprofits, OpenAI and OpenResearch, and has provided funding to a third, not previously reported, known as UBI Charitable.
UBI Charitable’s mission is to research and deploy Universal Basic Income (UBI) programs — the no-strings-attached payout scheme that futurists like Altman and Musk believe will be necessary when advances in robotics and AI, similar to those being developed by the two technologists, render many human occupations unprofitable. It is already funding at least two UBI schemes.
Understanding the connections and the flows of money between Altman’s businesses and charities means going back to 2015.
That was the year that Altman co-founded OpenAI with Musk, Reid Hoffman and others, as a 501(c)(3) organization to safely and transparently pursue AI research. It was also the year he spun out a separate nonprofit research lab from Y Combinator that would ultimately be called OpenResearch. This research lab was launched to tackle work that required a very long time horizon, sought to answer open-ended questions or develop technology that Altman thought should not be owned by any one company.
“We’re not doing this with the goal of helping YC’s startups succeed or adding to our bottom line,” wrote Altman on Y Combinator’s blog at the time. “At the risk of sounding cliché, this is for the benefit of the world.”
He claimed in the blog that he would start off by personally donating $10 million to OpenResearch and raise more money later.
A filing with the IRS shows that the lab in fact received only $1 million in donations in 2016. Funding for OpenResearch initially lagged, but would eventually top $10 million by 2019. The source of that money was not specified. OpenResearch has received a total of nearly $24.5 million in funding since it was established, according to tax filings. Altman also provided a $5.2 million loan to the organization in 2016, and increased it year by year. Altman had loaned OpenResearch a total of $14 million by the end of 2021, according to the latest records (although he has forgiven some of the debt).
The 2016 filing also claimed that OpenResearch had already made “significant progress” in such diverse areas as programming languages, simulation systems, physical/virtual user interfaces, computer-mediated student-teacher interaction and virtual reality.
OpenResearch kept a low profile in its early years. That changed with the COVID-19 pandemic.
In March 2020, as the virus was shutting down America, Altman tweeted a call for help with clinical trials of potential therapies, which connected him to computational biologist Benjamine Liu, a founder of TrialSpark.
OpenResearch provided TrialSpark with a $1 million grant to help set up Project Covalence, a platform to support COVID-19 trials in community settings or at patients’ homes. The project’s website stated: “The world doesn’t have time to waste. By coordinating efforts, sharing resources, and streamlining logistics, we can halt the spread of COVID-19 together.”
At least one trial did take place, not for an actual therapy, but for a remote diagnostic test for COVID antibodies. The trial in the summer of 2020 was a success, gathering high-quality samples and positive feedback from participants.
And yet, by late summer 2021, Project Covalence’s website disappeared. Not long after, Altman led a $156 million Series C investment in the company. TrialSpark’s valuation would pop to $1 billion by the time the round closed.
“When donors give, and then benefit from their donations, arguably they are not promoting the public good, but rather their own good,” says Patricia Illingworth, a philosophy professor at Northeastern University and author of Giving Now, a book about the ethics of philanthropy. “I am reminded of the practice of parents donating to the schools their children attend. The donation has an element of self-dealing to it.”
TrialSpark provided the following statement: “We wound down Project Covalence as vaccines and therapies were authorized and approved. We had no concerns about OpenResearch’s contribution to Project Covalence and Sam’s investment in TrialSpark because they are two separate things.”
Altman could not be reached for comment, but a spokesperson for OpenResearch supplied a statement along similar lines: “Project Covalence was part of a number of efforts during the pandemic, a project that the OpenResearch board felt would be beneficial to the public at that time. It is important to note that Project Covalence is different from TrialSpark.”
A press release issued by TrialSpark itself in July 2020 described Project Covalence as a platform of TrialSpark.
AI vs jobs
By 2020, OpenResearch had largely abandoned its work on user interfaces and virtual reality. Aside from its one-off grant to TrialSpark, OpenResearch’s attention and funds would now be dedicated to UBI research.
In a lengthy 2021 essay, Altman predicted that AI technologies might be able to pay every American $13,500 a year by 2031, and “that dividend could be much higher if AI accelerates growth.” Last year, he tweeted in favor of a $25 minimum wage: “I think it’s good to force the issue on automating jobs we aren’t willing to pay that much for anyway. Long term, I still think this is all the wrong framing and we will probably need something like UBI.”
And he was ready to put his nonprofit’s money where his mouth was.
Altman drew funds in 2021 from OpenAI and made a $75,000 grant to OpenResearch to work on UBI. That work involves designing and evaluating UBI programs, and advising other groups.
It makes sense that Altman turned to OpenAI to fund other projects. After all, OpenAI has had no difficulty in attracting donors. By 2018, it had raked in nearly $100 million to fund research projects into AI gaming, training a dexterous robot hand, organizing machine conferences and building out its AI safety team. But it had yet to make any external grants. The same year, Musk surrendered his board seat, citing possible conflicts of interest with Tesla’s AI efforts.
In 2019, most of OpenAI’s 125 employees transferred over to a new for-profit business, confusingly also called OpenAI, that would seek to commercialize the technologies it had developed, including the GPT large language models and text-to-image generators. Microsoft invested $1 billion, alongside other investors and VCs.
But the original nonprofit still had $30 million in the bank. With its AI technologies spun off, it now started to make grants, starting with modest contributions to organizations such as the ACLU, Black Girls Code, and Campaign Zero — a nonprofit seeking to end police violence.
Then in 2020, the original OpenAI gave away $10 million, nearly one third of its assets, in a previously unreported donation to a nonprofit called UBI Charitable, launched that same year. UBI Charitable does not have a website, or any salaried employees or volunteers, and its address is identical to that of OpenResearch.
A tax filing with the state of California reveals that UBI Charitable’s “primary and only currently planned activity will be grant-making to organizations that run universal basic income programs, and other policies and programs aimed at broadly distributing the benefits of technological advancement.”
UBI Charitable’s president and treasurer is Altman’s long-time friend and ex-Mountain View mayor, Chris Clark. Clark is also director of OpenResearch, as well as head of strategy at OpenAI itself. UBI Charitable’s only other income in 2020, a $15 million donation, came via a donor advised fund that protects its originator’s identity. It received another $5.3 million in 2021.
UBI Charitable started spending almost immediately. Since 2020, it has given $8.3 million to CitySquare, an anti-poverty charity in Dallas, and another $8.2 million to Heartland Alliance, a similar organization in Chicago that is already running a UBI pilot, called Chicago Resilient Communities. At the end of 2021, the most recent year for which tax records are available, UBI Charitable was sitting on assets of nearly $15 million.
Fixing the problem it created
The ethics of both funding AI, a technology that could lead to job losses, and providing for people whose livelihoods it threatens, are undeniably complex.
AI technology itself can see two sides to Altman’s actions. When TechCrunch asked ChatGPT, it noted: “If the entrepreneur’s nonprofit is creating a tool that could lead to job loss, he or she may be seen as having a responsibility to mitigate the harm that could result. By funding another nonprofit to provide support for those who may lose their jobs, the entrepreneur may be seen as fulfilling this responsibility.”
However, the AI system went on: “If the entrepreneur’s actions are motivated by a desire to protect his or her financial interests, rather than a genuine concern for those who may be impacted by the tool, this could be seen as a conflict of interest and potentially unethical.”
Of course, no one should rely on legal or ethical advice from a chatbot. And, as Illingworth notes: “We want billionaires to give away their money as fast as they can.”
Whether Altman is trying to stay ahead of a coming technological tsunami, covering his ass, or some combination of the two, the net result is still millions of dollars being funneled to people in financial need. What remains to be seen is whether Altman’s UBI charity keeps pace with the changes AI seems likely to bring, and the profits ChatGPT seems likely to generate, in the years to come.