AI firms increase federal lobbying spending in 2024 amid regulatory uncertainty


Companies spent more on lobbying over AI issues at the US federal level last year than in 2023 amid regulatory uncertainty.

According to data compiled by OpenSecrets, 648 companies spent money on AI lobbying in 2024, up from 458 companies in 2023, a roughly 41% year-over-year increase.

Companies such as Microsoft have supported legislation like the CREATE AI Act, which would support the benchmarking of artificial intelligence systems developed in the United States.

The data shows that most AI labs (that is, companies dedicated almost exclusively to commercializing various kinds of AI technology) spent more in support of their legislative agendas in 2024 than in 2023.

OpenAI increased its lobbying spending to $1.76 million last year from $260,000 in 2023. OpenAI’s close competitor Anthropic increased its spending from $280,000 in 2023 to $720,000 last year, while enterprise-focused startup Cohere more than doubled its spending in 2024 from just $70,000 two years ago.

Both OpenAI and Anthropic hired staff over the past year to coordinate outreach to policymakers. Anthropic brought on its first in-house lobbyist, Department of Justice alum Rachel Appleton, while OpenAI tapped political veteran Chris Lehane as its new vice president of policy.

All told, OpenAI, Anthropic, and Cohere committed a combined $2.71 million to federal lobbying in 2024. That is a small sum compared with what the broader tech industry put toward lobbying over the same period ($61.5 million), but more than four times the total the three AI labs spent in 2023 ($610,000).

TechCrunch reached out to OpenAI, Anthropic, and Cohere for comment, but did not receive a response at press time.

The past year was a tumultuous one in domestic AI policymaking. In the first half alone, lawmakers in Congress considered more than 90 pieces of AI-related legislation, according to the Brennan Center, and more than 700 bills were proposed at the state level.

Congress made little progress, prompting state legislatures to move forward. Tennessee became the first state to protect voice artists from unauthorized AI cloning. Colorado adopted a tiered, risk-based approach to AI policy. And California Governor Gavin Newsom signed a batch of AI-related safety bills, some of which require AI companies to disclose details about their training.

No US policymaker has succeeded in passing AI regulation as comprehensive as international frameworks such as the EU’s AI Act, however.

After a long battle with special interests, Governor Newsom vetoed SB 1047, a bill that would have imposed wide-ranging safety and transparency requirements on AI developers. Texas’ TRAIGA, an even broader bill, could suffer the same fate once it makes its way through the statehouse.

It’s unclear whether the federal government will make more progress on AI legislation this year than it did last year, or whether there is even a strong appetite for codification. President Donald Trump has signaled his intention to largely deregulate the industry, clearing away what he sees as obstacles to US dominance in artificial intelligence.

On his first day in office, Trump rescinded an executive order from former President Joe Biden that sought to mitigate the risks AI could pose to consumers, workers, and national security. On Thursday, Trump signed an EO directing federal agencies to halt certain Biden-era AI policies and programs, potentially including export rules on AI models.

In November, Anthropic called for “targeted” federal AI regulation within the next 18 months, warning that “the window for proactive risk mitigation is rapidly closing.” For its part, OpenAI in its latest policy document called on the US government to take more substantive action on infrastructure to support the development of AI technology.


