WSJ: OpenAI is hitting a wall with GPT-5 training
After 18 months’ work and hundreds of millions of dollars’ worth of computing time training its next major foundational model, GPT-5, OpenAI seems to have hit a wall.
New reporting from The Wall Street Journal says the company is not seeing the exponential leap in its next-gen model (known internally as “Orion”) that OpenAI researchers — and OpenAI investors — had expected.
The AI “scaling law” that has until now consistently delivered more powerful, more capable models simply by feeding more data into ever more expensive GPUs is showing signs of plateauing. Researchers are scrambling to find fresh reserves of training data, as most of the internet has already been harvested.
Much of the AI industry has followed this pattern of model development, so if the current approach is reaching its theoretical limits, it could shake up the power structure of the industry.
Companies like Meta, Amazon, xAI, Google, and others are spending billions of dollars on data centers powered by hundreds of thousands of specialized training GPUs, like Nvidia’s popular Hopper series. Investors have been promised continued leaps in AI technology in exchange for these huge capital expenditures on computing infrastructure.
OpenAI just announced its new o3 “reasoning” models, which the company is hoping will help break through the current barriers.