AI researchers trained an OpenAI competitor in 26 minutes for less than $50

Researchers at Stanford and the University of Washington have developed an AI model that could compete with Big Tech rivals — and trained it in 26 minutes for less than $50 in cloud compute credits.

In a research paper published last Friday, the team reported that the new “s1” model performs comparably to advanced reasoning models like OpenAI’s o1 and DeepSeek’s R1 on tests of mathematical problem-solving and coding ability.

The researchers said that s1 was distilled from “Gemini 2.0 Flash Thinking Experimental,” one of Google’s AI models, and that they used “test-time scaling”: presenting a base model with a set of questions and giving it more time to think before it answers. While the technique is widely used, the team aimed for the “simplest approach,” relying on supervised fine-tuning, a process in which a model is explicitly trained to mimic certain behaviors.
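
The distillation recipe is simple in outline, as the sketch below illustrates. It is a rough Python illustration under stated assumptions, not the team’s released pipeline: collect_teacher_trace and the dataset format are hypothetical stand-ins for querying the teacher model and for whatever training setup the researchers actually used.

# A minimal sketch of building a distillation dataset for supervised
# fine-tuning (SFT). Both functions are hypothetical stand-ins, not
# the s1 team's code; their actual pipeline is on GitHub.

def collect_teacher_trace(question: str) -> str:
    """Stand-in for querying the teacher model (here, Gemini 2.0 Flash
    Thinking Experimental) for a worked reasoning trace and answer."""
    return f"Reasoning about: {question} ... Final answer: 408"

def build_sft_dataset(questions: list[str]) -> list[dict]:
    """Pair each question with the teacher's trace; a student base
    model is then fine-tuned to reproduce these completions."""
    return [{"prompt": q, "completion": collect_teacher_trace(q)}
            for q in questions]

dataset = build_sft_dataset(["What is 17 * 24?"])
# `dataset` would then be handed to any standard SFT trainer.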

In the paper, the researchers discuss using simple commands like “Wait”:

“...by appending ‘Wait’ multiple times to the model’s generation when it tries to end. This can lead the model to double-check its answer, often fixing incorrect reasoning steps.”
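
To make the “Wait” trick concrete, here is a minimal Python sketch of the idea (the paper’s term for it is “budget forcing”), assuming a generic text-completion API. model_generate, the </think> end-of-reasoning marker, and the number of extra passes are all hypothetical stand-ins, not the paper’s actual code.

# Budget forcing, roughly as the paper describes it: whenever the model
# tries to end its reasoning, suppress the stop and append "Wait" so it
# keeps thinking. All names here are illustrative stand-ins.

def model_generate(prompt: str, stop: str) -> str:
    """Hypothetical model call: returns new reasoning text, halting
    before it would emit the `stop` marker. Swap in a real client."""
    return " ...some further reasoning steps..."  # canned output

def reason_with_budget(question: str, extra_passes: int = 2) -> str:
    """Buy the model more thinking time at test time by refusing to
    let it stop, nudging it to double-check earlier steps."""
    end_marker = "</think>"  # assumed end-of-reasoning delimiter
    trace = model_generate(question, stop=end_marker)
    for _ in range(extra_passes):
        trace += " Wait"  # the nudge described in the paper
        trace += model_generate(question + trace, stop=end_marker)
    return trace

print(reason_with_budget("What is 17 * 24?"))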

With their methodology, the researchers report using a relatively small dataset and an off-the-shelf base model to cheaply recreate an AI model’s “reasoning” abilities. Now the s1 model, along with the data and code used to train it, is on GitHub… which will, presumably, not please big AI companies. (It was only days ago that OpenAI accused DeepSeek of ripping off ChatGPT to train its models.) Indeed, mounting concern about unauthorized distillation has given rise to the word “distealing” in the AI community.

The researchers said that the fine-tuning was done on 16 H100 GPUs from Nvidia.

Dan Ives thinks Tesla will someday merge with SpaceX, too

Wedbush Securities analyst Dan Ives is just like us: he thinks that Elon Musk’s Tesla and SpaceX could someday become one company.

In a note this morning, Ives argued there’s a “growing chance” Tesla will eventually merge in some form with the newly combined SpaceX and xAI, as Musk builds what he sees as a single, sprawling AI ecosystem spanning both space and Earth.

Over time, Ives wrote, he thinks Musk will look to “combine forces/technologies,” with the long-term goal of owning and controlling more of the AI stack. Ives thinks Musk could achieve that “holy grail” over the next year and a half.

Earlier today, we pointed out the myriad similarities between Tesla and SpaceX — shared impossible missions, common methods for achieving those goals, and a physics-first, economics-later ethos — as well as Musk’s long-standing penchant for knitting his companies together in the first place.

SpaceX merges with xAI, reportedly will seek an IPO valuation of $1.25 trillion

Elon Musk says his space company has merged with his AI company, with the lofty goal of eventually putting data centers in space.

Analyst: Investors should brace for Europe’s breakup with US Big Tech

The signs are there: the French government has restricted the use of Zoom for its employees. In Germany, the state of Schleswig-Holstein is ending the use of Microsoft Teams among its workers.

As US-EU tensions rise, Europe is looking to secure its own “digital sovereignty,” reduce its dependence on US-owned technology platforms, and grow its domestic tech industry. It now seems the European breakup with Big Tech is underway.

Tuttle Capital Management CEO Matthew Tuttle thinks that most investors aren’t paying enough attention to this growing risk to American tech stocks.

In a note to investors, Tuttle wrote:

“The world is building optionality away from U.S. policy and platform dependence. And once you see it, you can’t unsee it — because it’s showing up in procurement decisions, supply chains, defense budgets, and capital flows.”
