GPT-5.1 screenshot (OpenAI)

OpenAI releases GPT-5.1, which can be “Professional,” “Candid,” or “Quirky”

The new “more conversational” model follows instructions better, but backslides on some safety tests.

Jon Keegan

Today OpenAI released GPT-5.1, an update that aims to make ChatGPT “more conversational.” The model comes in two versions: GPT-5.1 Instant (“now warmer, more intelligent, and better at following your instructions”) and GPT-5.1 Thinking (“now easier to understand and faster on simple tasks, more persistent on complex ones”).

Despite an earlier update this year that was rolled back for being overly sycophantic, the new model responds in a chummier, more conversational tone that the company says “surprises people with its playfulness” in testing.

Users now have finer control over ChatGPT’s “personality,” with new settings for “Professional,” “Candid,” and “Quirky.”

In the model’s system card, OpenAI details how the new 5.1 models compare to the earlier GPT-5 models on internal benchmarks for disallowed content.

The company has said it is prioritizing the addition of new checks to help users who may be suffering a mental health crisis, after a series of alarming incidents where ChatGPT encouraged self-harm and reinforced delusional behavior.

Two new tests were included with this release for the first time: “mental health” and “emotional reliance.” GPT-5.1 Thinking actually scored slightly lower than its predecessor, GPT-5 Thinking, in 9 of 13 testing categories, and GPT-5.1 Instant scored lower than GPT-5 Instant in 5 of 13.

More thinking, more tokens

OpenAI says that GPT-5.1 Thinking now spends less time on simple tasks and more time on difficult problems. This is measured by the number of model-generated tokens (tiny bits of text). Based on a chart in the announcement, the very toughest queries will take GPT-5.1 Thinking 71% more tokens to complete. That’s a lot more tokens, and a lot more computing!

All those tokens can add up. Every time OpenAI’s customer-facing models gobble up more computing cycles, the company spends more on “inference,” or running the models (as opposed to the more resource-intensive training process that happens while building the models). When enterprise customers access the models through OpenAI’s API, they pay by the token, but free users of the chat interface do not.
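For a rough sense of what that means per request, here is a minimal back-of-the-envelope sketch in Python. The per-million-token price and the baseline token count are made-up placeholders (OpenAI’s actual API pricing and token counts will differ); only the 71% increase comes from the chart in OpenAI’s announcement.

```python
# Back-of-the-envelope sketch: how 71% more "thinking" tokens changes the
# per-request bill when billing is purely by generated (output) tokens.
# The price and baseline token count below are hypothetical placeholders,
# not OpenAI's actual figures; only the 71% increase is from the announcement.

HYPOTHETICAL_PRICE_PER_MILLION_OUTPUT_TOKENS = 10.00  # USD, assumed

def request_cost(output_tokens: int, price_per_million: float) -> float:
    """Cost of one request when billed only on generated tokens."""
    return output_tokens / 1_000_000 * price_per_million

baseline_tokens = 20_000                        # assumed tokens for one hard query today
heavier_tokens = round(baseline_tokens * 1.71)  # 71% more tokens on the toughest queries

print(f"Baseline hard query:         ${request_cost(baseline_tokens, HYPOTHETICAL_PRICE_PER_MILLION_OUTPUT_TOKENS):.2f}")
print(f"GPT-5.1 Thinking hard query: ${request_cost(heavier_tokens, HYPOTHETICAL_PRICE_PER_MILLION_OUTPUT_TOKENS):.2f}")
```

At these assumed numbers, the toughest queries go from roughly $0.20 to $0.34 apiece. API customers pay that difference themselves; for free chat users, the extra tokens land on OpenAI’s own inference bill with no token revenue to offset them.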

As a private company, OpenAI’s finances aren’t public, but a new report from the Financial Times raises the question of how much all these “thinking” models are costing the company. While The Information recently reported that OpenAI spent $2.5 billion in the first half of 2025, AI skeptic, podcaster, and writer Ed Zitron told the FT he has seen internal OpenAI figures showing that OpenAI’s cash burn for the first half of the year was much higher — close to $5 billion.

To satisfy the $1 trillion in recent deals it has signed on to, OpenAI will need to find a way to generate more revenue.

More Tech

Trump to push Big Tech to fund new power plants as AI drives up electricity costs

President Donald Trump is expected to announce a plan Friday morning that would require Big Tech companies to bid on 15-year contracts for new electricity generation capacity. The move would effectively force companies to help fund new power plants in the PJM region as soaring demand from AI data centers pushes up electricity costs across the US power grid.

Earlier this week, Trump called on tech giants to “pay their own way,” arguing that households and small businesses should not bear the cost of power infrastructure needed to support energy-hungry data centers.

Microsoft quickly responded, saying it would “pay utility rates that are high enough to cover our electricity costs,” along with committing to other changes aimed at easing pressure on the grid. Other major tech companies are expected to follow suit, though Wedbush Securities analyst Dan Ives warned the added costs could slow the pace of data center build-outs.

As we’ve noted, forcing tech companies to shoulder higher electricity costs is likely to hit some firms harder than others. Companies like Microsoft, Google, and Amazon can pass at least some of those costs on to customers by selling data center capacity downstream. Meta, in contrast, does not have a cloud business, meaning its AI ambitions lack a direct revenue stream to offset rising power costs.

So far tech stocks don’t appear to be affected much in premarket trading. However, utility companies most levered to the AI boom certainly are, with Vistra, Constellation Energy, and Talen Energy deep in the red ahead of the open as analysts at Jefferies warn that these firms face risks from the plan.


OpenAI working to build a US supply chain for its hardware plans, including robots

When OpenAI purchased Jony Ive’s io, it entered the hardware business. The company is currently ramping up to produce a mysterious AI-powered gadget.

But OpenAI plans on making more than just consumer gadgets — it also plans on making data center hardware, and even robots.

Bloomberg reports that OpenAI has been on the hunt for US-based suppliers for silicon and motors for robotics, as well as cooling systems for data centers.

AI companies are looking toward robots as a logical next step for finding applications for their models.

OpenAI told Bloomberg that US companies building the AI brains of robots might have an edge against the Chinese hardware manufacturers that are currently making some impressive humanoid robots.


ICE agents arrest workers from Meta’s Hyperion data center site

Yesterday, US Immigration and Customs Enforcement (ICE) officers stopped and arrested two workers from Meta’s massive Hyperion data center construction site in Richland Parish, Louisiana.

According to the Richland Parish Sheriff’s Office, the two men, both dump truck drivers, were arrested during a traffic stop as they headed to the construction site, where thousands of people are working.

Bloomberg reports that unmarked vehicles at the perimeter of the construction site were stopping and checking the identification of workers. The Sheriff’s Office said ICE agents did not enter the Meta site at any time.


Two cofounders leave Thinking Machines Lab to return to OpenAI

A group of researchers has left Mira Murati’s Thinking Machines Lab to go back to OpenAI. Fidji Simo, OpenAI’s head of apps, posted on X that Thinking Machines cofounders Barret Zoph and Luke Metz, along with Sam Schoenholz, will be returning to the company.

In October, Thinking Machines cofounder Andrew Tulloch left to work for Meta.

Thinking Machines Lab was cofounded by Murati, a former OpenAI executive, and the startup has been raising large amounts of money, reportedly at a $50 billion valuation.

