
A nonprofit group is joining Elon Musk’s efforts to block OpenAI’s commercial transition


Encode, the nonprofit that co-sponsored California's SB 1047 AI safety legislation, has asked for permission to file an amicus brief in support of Elon Musk's injunction to halt OpenAI's transition to a for-profit company.

In a proposed brief filed Friday afternoon in the U.S. District Court for the Northern District of California, Encode's counsel said that converting OpenAI into a for-profit would "undermine" the firm's mission to "develop and deploy transformative technology in a way that is safe and beneficial to the public."

“OpenAI and its CEO, Sam Altman, claim to be developing technology that is transforming society, and those claims should be taken seriously,” the brief said. “If the world is truly at the threshold of a new age of artificial general intelligence (AGI), then the public has a strong interest in having this technology managed by a public charity legally bound to prioritize safety and public benefit rather than an organization focused on generating financial returns for a few privileged investors.”

In a statement, Sneha Revanur, founder and president of Encode, accused OpenAI of “internalizing [AI’s] profits but externalizing the consequences to all of humanity.”

Encode’s brief received support from AI pioneer and 2024 Nobel laureate Geoffrey Hinton and Stuart Russell, UC Berkeley professor of computer science and director of the Center for Human-Compatible AI.

“OpenAI was founded as an explicitly safety-focused nonprofit and made a variety of safety promises in its charter,” Hinton said in a press release. “It received many tax and other benefits from its nonprofit status. Letting them tear it all up when it’s inconvenient sends a very bad message to other actors in the ecosystem.”

OpenAI was launched in 2015 as a nonprofit research laboratory. But as its experiments became increasingly capital-intensive, it created its current structure, taking on outside investment from VCs and companies, including Microsoft.

Today, OpenAI has a hybrid structure: a for-profit side controlled by a nonprofit, with a “capped-profit” share for investors and employees. But in a blog post this morning, the company said it plans to convert its existing for-profit into a Delaware Public Benefit Corporation (PBC), with ordinary shares of stock and the OpenAI mission as its public benefit interest.

OpenAI’s nonprofit will remain, but it will cede control in exchange for shares in the PBC.

Musk, an early contributor to the original nonprofit, filed a lawsuit in November seeking an injunction to halt the proposed change, which has long been in the works. He accused OpenAI of abandoning its original philanthropic mission of making the fruits of its AI research publicly available, and of depriving rivals, including his own AI startup xAI, of capital through anticompetitive means.

OpenAI has called Musk’s complaints “baseless” and simply a case of sour grapes.

Meta, Facebook’s parent company and an AI rival of OpenAI, is also supporting efforts to block OpenAI’s conversion. In December, Meta sent a letter to California Attorney General Rob Bonta arguing that allowing the change would have “seismic implications for Silicon Valley.”

Encode’s lawyers said that OpenAI’s plan to hand control of its operations to a PBC would convert “an organization bound by law to ensure the safety of advanced AI” into one legally required to balance any consideration of the public benefit against “the pecuniary interests of [its] stockholders.”

For example, Encode’s counsel notes in the brief, OpenAI’s nonprofit has pledged to stop competing with any “value-aligned, safety-conscious project” that comes close to building AGI before it does, but OpenAI as a for-profit would have little (if any) incentive to do so. The brief also notes that the nonprofit’s board will no longer be able to cancel investors’ equity if needed for safety once the company’s restructuring is complete.

OpenAI continues to see an outflow of high-level talent, in part because of concerns that the company is prioritizing commercial products at the expense of safety. One former employee, Miles Brundage, a longtime policy researcher who left OpenAI in October, said in a series of posts on X that he worries OpenAI’s nonprofit will become a “side thing” that gives the PBC license to operate as a “normal company” without addressing potentially problematic areas.

“OpenAI’s purported fiduciary duty to humanity would evaporate, because Delaware law is clear that PBC directors owe no duty to the public,” Encode’s brief continued. “For a safety-focused, mission-constrained nonprofit to relinquish control of something so transformative, at any price, to a for-profit entity with no binding commitment to safety would harm the public interest.”

Founded by Revanur in July 2020, Encode describes itself as a network of volunteers focused on ensuring the voices of younger generations are heard in conversations about the implications of artificial intelligence. In addition to SB 1047, Encode has contributed to various state and federal pieces of AI-related legislation, including the White House’s AI Bill of Rights and President Joe Biden’s executive order on AI.

Updated December 30, 10:10 a.m. Pacific with statements from Revanur and Hinton.
