Physical Address
304 North Cardinal St.
Dorchester Center, MA 02124
The list of advanced AI models that have missed their promised launch windows continues to grow.
Last summer, billionaire Elon Musk, the founder and CEO of AI company xAI, said xAI's next major AI model, Grok 3, would arrive by the "end of the year" in 2024. Grok, xAI's answer to models like OpenAI's GPT-4o and Google's Gemini, can analyze images and answer questions, and powers a number of features on Musk's social network, X.
"Grok 3 should be something really special later in the year after training on 100k H100s," Musk wrote in a post on X in July, referring to xAI's giant cluster of GPUs in Memphis. "Grok 3 will be a huge leap forward," he said in a follow-up post in mid-December.
Still, it's January 2 and Grok 3 hasn't arrived, nor are there any signs that its release is imminent.
In fact, some code on xAI's website, spotted by AI consultant Tibor Blaho, suggests that an interim model called "Grok 2.5" may land first.
Grok(.)com is coming out with the Grok 2.5 model soon (grok-2-son – “Our smartest model”) – thanks for the tip, anon! pic.twitter.com/emsvmZyaf7
— Tibor Blaho (@btibor91) December 20, 2024
Granted, this isn't the first time Musk has set a lofty goal and missed it. It's well established that Musk's announcements about product launch timelines are often unrealistic at best.
But Grok 3's MIA status is notable because it's part of a growing trend.
Last year, AI startup Anthropic failed to deliver a successor to its high-profile Claude 3 Opus model. Months after announcing that a next-generation model, Claude 3.5 Opus, would be released by the end of 2024, Anthropic removed all mention of the model from its developer documentation. (According to one report, Anthropic finished training Claude 3.5 Opus last year but decided it didn't make economic sense to release it.)
Google and OpenAI have also reportedly suffered setbacks with flagship models in recent months.
To a large extent, the culprit is the limitations of existing AI scaling laws: the methods companies use to increase the capabilities of their models. In the not-too-distant past, significant performance gains could be achieved by training models with vast amounts of computing power on larger and larger datasets. But with each new model generation the gains have begun to diminish, prompting companies to pursue alternative techniques.
There may be other reasons for Grok 3's delay; xAI has a smaller team than many of its competitors, for one. Even so, the missed launch window is further evidence that conventional AI training approaches are running up against a wall.