What the competition — Nvidia, OpenAI, Microsoft — is saying about DeepSeek
News of DeepSeek’s AI models pummeled American AI stocks yesterday. Leaders at rival companies were quick to comment on how DeepSeek was seemingly able to do so much more for so much less.
deepseek's r1 is an impressive model, particularly around what they're able to deliver for the price.

we will obviously deliver much better models and also it's legit invigorating to have a new competitor! we will pull up some releases.

— Sam Altman (@sama) January 28, 2025
While OpenAI CEO Sam Altman struck a genial tone, calling DeepSeek’s ingenuity impressive and welcoming the competition, Nvidia was a little more backhanded, suggesting that DeepSeek had only done the easy part:
“DeepSeek is an excellent AI advancement and a perfect example of Test Time Scaling. DeepSeek’s work illustrates how new models can be created using that technique, leveraging widely-available models and compute that is fully export control compliant. Inference requires significant numbers of NVIDIA GPUs and high-performance networking. We now have three scaling laws: pre-training and post-training, which continue, and new test-time scaling.”
Microsoft CEO Satya Nadella suggested on LinkedIn that DeepSeek is something of a rising tide that will lift all boats, invoking the Jevons paradox, the idea that making a resource more efficient to use tends to increase overall demand for it.
Tesla CEO Elon Musk, for his part, seems to have gone the sour grapes route, agreeing with Scale AI CEO Alexandr Wang that DeepSeek probably has more high-powered Nvidia chips than it’s letting on.
Obviously
— Elon Musk (@elonmusk) January 27, 2025