OpenAI Chief Executive Officer Sam Altman (Kim Jae-Hwan/Getty Images)

OpenAI’s o1-pro is the most expensive AI model in the industry

No other model by a major AI company comes even close.

OpenAI just released the pricing for its o1-pro reasoning model, which draws on an insane amount of computing power to “reason” through problems in multiple steps and produce better responses to prompts. This computing power doesn’t come cheap: the new pricing is the highest for any major model in the industry today, and by a lot.

As a regular human user, you can use plenty of AI tools for free, or pay maybe $20 per month for OpenAI’s ChatGPT Plus or Google’s Gemini Advanced if you’re a heavy user. But that’s not where the money is.

When companies are hooking their services up to AI platforms behind the scenes via an API (application programming interface), the costs can really add up. So what is the standard unit of measure for AI costs?

API pricing for AI models is measured by how much data (words, images, video, audio) you put into a model and how much data gets spit back out to you. The output costs more than the input.

The common measure for this is 1 million “tokens.” In AI parlance, a “token” is like an atomic unit of data. When text is input into a model, the words and sentences get broken down into these tokens for processing; a single token might be just a few letters. For OpenAI’s models, one token is roughly four characters in English. So a paragraph is about 100 tokens, give or take.
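
To make that rule of thumb concrete, here’s a minimal Python sketch that estimates token counts from raw character length using the roughly-four-characters-per-token heuristic above. It’s only an approximation; an actual tokenizer (such as OpenAI’s tiktoken library) would give somewhat different counts.

```python
def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Rough token estimate: OpenAI models average ~4 English characters per token."""
    return round(len(text) / chars_per_token)

paragraph = (
    "API pricing for AI models is measured by how much data you put into "
    "a model and how much data gets spit back out to you, priced per "
    "million tokens, with output costing more than input."
)

print(estimate_tokens(paragraph))  # roughly 50 tokens for these few sentences
```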

For a million tokens, think Robert Caro’s epic biography of Robert Moses, “The Power Broker” — which I’m currently halfway through — a 2.3-pound, 1,300-page beast of a book. A rough estimate of this tome comes out to about 850,000 tokens.

If you put 1 million tokens into some of the leading models today, you could probably pay for it with just a few coins. For OpenAI’s GPT-4o Mini, the input would cost you only $0.15, while the output would cost $0.60. Google’s Gemini 2.0 Flash would cost you a single penny for the input and $0.04 for the output.

OpenAI o1-pro’s pricing for 1 million tokens of input is $150, and $600 for the output.
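
To put those numbers side by side, here’s a small Python sketch that prices a “Power Broker”-sized job against the three models mentioned above, using the per-million-token rates cited in this article. The 100,000-token output figure is an illustrative assumption, not a figure from the article.

```python
# Published API prices, in dollars per 1 million tokens (as cited above).
PRICES = {
    "GPT-4o Mini":      {"input": 0.15,  "output": 0.60},
    "Gemini 2.0 Flash": {"input": 0.01,  "output": 0.04},
    "o1-pro":           {"input": 150.0, "output": 600.0},
}

def job_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Cost in dollars for a single API job, given per-million-token rates."""
    p = PRICES[model]
    return (input_tokens / 1_000_000) * p["input"] + (output_tokens / 1_000_000) * p["output"]

# "The Power Broker" is roughly 850,000 tokens of input; the 100,000-token
# output below is an illustrative assumption.
for model in PRICES:
    print(f"{model}: ${job_cost(model, 850_000, 100_000):,.2f}")

# GPT-4o Mini: $0.19
# Gemini 2.0 Flash: $0.01
# o1-pro: $187.50
```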

In a tweet announcing the pricing, OpenAI wrote, “It uses more compute than o1 to provide consistently better responses.”

It’s worth pointing out that there are huge differences in the capabilities of these models: some are very small and built for specific use cases like running on a mobile device, and others are massive and built for advanced tasks, so differences in price are to be expected. But even so, OpenAI’s pricing stands apart from the crowd.

Pricing is a key issue for OpenAI as it struggles to find a viable business model to cover the enormous costs of running these services. The company’s recent pivot to releasing only “reasoning” models like o1-pro means much higher computing costs, as evidenced by reports that solving individual ARC-AGI puzzles cost about $3,400 apiece.

Recently, The Information reported that OpenAI was considering charging $20,000 per month for “PhD-level agents.”

CEO Sam Altman said in January that OpenAI is losing money on its ChatGPT Pro product.

The company is reportedly raising money at a valuation of $340 billion, and in 2024 it was reported to have lost about $5 billion, after bringing in only $3.7 billion in revenue.

tech

Humanoid robot maker Apptronik raises $520 million

Apptronik, an Austin, Texas-based robot manufacturer, said it has closed out its Series A fundraising round, raising $520 million. The fundraising is an extension of a $415 million round raised last February, and included investments from Google, Mercedes-Benz, AT&T, and John Deere. Qatar’s state investment firm, QIA, also participated in the fundraising round.

Apptronik makes Apollo, a humanoid robot targeted for warehouse and manufacturing work. The company is one of several US robotics companies that are racing to apply generative-AI breakthroughs to humanoid robots, in anticipation of a new market for robots in homes and workplaces.

tech

Ives: Microsoft and Google’s giant capex plans are worth it

Don’t mind the AI sell-off, says Wedbush Securities analyst Dan Ives, who thinks fears around seemingly unfettered Big Tech capex budgets are unfounded, especially in the case of Microsoft and Google. Together, the two hyperscalers are slated to spend around $300 billion on purchases of property and equipment this year as they double down on AI infrastructure, but he says both have already shown that they can turn that spending into revenue and growth.

“They are reshaping cloud economics around AI-first workloads that carry higher switching costs, deeper customer lock-in, and longer contract durations than before,” Ives wrote, adding that these giant costs will be spread out over time and set the companies up for success in the long run. Per Ives:

“While near-term free cash flow optics remain noisy, the platforms that invest early and at scale are best positioned to capture durable share, pricing power, and ecosystem control as AI workloads mature. Over time, we expect utilization leverage to turn today’s elevated investment into a meaningful driver of long-term value creation.”

tech
Jon Keegan

Meta reportedly expands Hyperion data center site, purchasing an additional 1,400 acres

Construction is humming along at Meta’s gargantuan Hyperion data center in Richland Parish, Louisiana.

And Meta is seemingly already moving ahead with plans to greatly expand the site.

A new report from Forbes revealed that Meta has purchased an additional 1,400 acres adjacent to the construction site, increasing the overall size of the project by 62%. The site is massive: nearly 5 miles long and 1 mile wide.

Meta CEO Mark Zuckerberg has said that the site “will be able to scale up to 5GW over several years.”

$290K
Rani Molla

Tesla has been quoting the price of its long-awaited long-range Semi truck at $290,000, Electrek reports. That’s roughly 60% higher than the originally announced $180,000, but still well below the industry average for Class 8 electric semi trucks. California Air Resources Board data shows that the average cost of a zero-emission Class 8 truck was $435,000 in 2024, meaning Tesla is undercutting competitors by about $145,000.

On its last earnings call, Tesla said it would start production on the “designed for autonomy” electric commercial truck this year.
