Classroom Full of Children
Learning curve

Teenagers are using AI to learn math and science from celebrities

That’s called a tangent, Ariana

David Crowther

Few people can make as much of an impact on your life as a good teacher — the kind who makes learning delightful and fun, imparts valuable knowledge, and, above all else, prepares us for the world that awaits.

Of course, not all teachers are quite as engaging as we’d like them to be. In days gone by, tough luck. In 2024, students have a few more options thanks to the advent of generative AI, and the youth of today are taking full advantage, creating content that uses the likenesses of celebrities such as Morgan Freeman, Kim Kardashian, and Donald Trump to learn about math and more.

Take this video from @onlocklearning on Instagram in which very-much-not-real versions of “Eminem” and “Ariana Grande” explain the concept of the exponential function.

Or this one from the same account — which has more than 566,000 followers — in which rapper “Cardi B” and Amazon founder “Jeff Bezos” answer the question that all teens are just dying to know the answer to: why (a+b)² = a² + 2ab + b².
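For anyone who skipped that class, the identity the fake celebrities are explaining expands in one line, and has a neat geometric reading: a square with side a + b splits into an a×a square, a b×b square, and two a×b rectangles.

```latex
(a+b)^2 = (a+b)(a+b) = a^2 + ab + ba + b^2 = a^2 + 2ab + b^2
```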

These videos are part of a small but growing trend of educational clips created for a new crop of students who, for better or worse (probably worse?), are used to consuming vertical video content on their phones. They share many of the hallmarks of viral TikTok or Reels videos:

  1. Quick cuts and slick editing.

  2. Music, often remixed.

  3. Very high information density; the jokes come thick and fast, and so do the equations.

Child’s play

As with any emerging technology, watching how younger people actually use it is one of the best predictors of where it might go. Part of the appeal of these videos is that they are genuinely useful: the “scripts” are clearly written by humans, and the math concepts are illustrated beautifully. But there’s no denying that the gimmick of having AI celebrities like Jenna Ortega and Barack Obama explain the concepts is core to the appeal on social media.

It’s not hard to imagine a well-funded company commodifying this type of content into a product. Indeed, many of the accounts appear to be creating the AI educational content simply as a marketing funnel for whatever tool helped to create the videos.

The more you know

Despite concerns about generative AI tools spewing misinformation and flubbing basic facts (like counting the Rs in “strawberry”), the technology’s potential to create new learning tools, or simply to reduce the burden on teachers of producing learning resources, represents a market potentially worth tens of billions of dollars.

Last year, Morgan Stanley estimated that “Generative AI could bring $200 billion in value to the global education sector by 2025”.

Interestingly, a new report out yesterday from Common Sense Media found that nearly 75% of teens have now experimented with at least one type of generative AI tool.

The biggest use case so far? Helping with homework.

[Chart: Teen AI usage. Source: Sherwood News]

Whether kids getting help on their assignments from chatbots is actually a good thing for their learning is yet to be seen, but there’s a big difference between getting ChatGPT to write your history essay and enlisting fake Snoop Dogg to help you with calculus.

More Tech

Jon Keegan

DeepSeek releases new V4 series models highlighting efficiency and long context

Chinese AI lab DeepSeek has released a major new version of its eponymous open-source AI models that are nipping at the heels of leading frontier models in some areas.

Most significant: both DeepSeek-V4 Pro and DeepSeek-V4 Flash have a 1 million-token context window — the amount of information the model can actively work with in a single session — a crucial feature for complex, long-running coding tasks.

DeepSeek rebuilt how the models process information under the hood, making them substantially more efficient — and that efficiency is what makes the large context window actually usable.

Also, the new models’ coding skills have closed the gap with the major frontier models from Anthropic, OpenAI, and Google.

The authors of the model acknowledge some of V4’s shortcomings, such as its lower scores on reasoning benchmarks, saying that V4 “trails state-of-the-art frontier models by approximately 3 to 6 months.”

Because they are open-weight, the V4 models can be run on a user’s own hardware, making them among the top-performing open-source models out there. V4’s large context and token efficiency are especially significant among open-source models.

But like with earlier DeepSeek models, don’t ask it about Tiananmen Square.


$28.5T
Rani Molla

SpaceX thinks its total addressable market (TAM) is a whopping $28.5 trillion for its businesses, according to an S-1 filing for its upcoming IPO reviewed by Reuters. And most of that market isn’t rockets. The company says roughly 90% could come from AI — largely selling artificial intelligence tools to businesses.

“We believe that our enterprise strategy, which is focused on serving the digital needs of the world’s largest industries with AI solutions, positions us competitively to pursue this rapidly growing opportunity,” SpaceX said in the filing. “We believe we have identified the largest actionable total addressable market in human history.”

TAM, of course, assumes capturing every possible customer. But even a small slice of a $28.5 trillion market would be enormous: a 1% share works out to $285 billion.

Rani Molla

Tesla Cybercab production has begun

On Tesla’s earnings call earlier this week, CEO Elon Musk said production of the company’s steering-wheel-less Cybercab had begun. Since then, Musk and Tesla have posted videos showing the gold two-seater rolling off the line at its Texas Gigafactory and onto the road.

The Cybercab — meant both for consumers and Tesla’s Robotaxi network — is widely seen as central to the company’s future. “The future of the company is fundamentally based on large-scale autonomous cars and large scale and large volume, vast numbers of autonomous humanoid robots,” Musk said last year.

Whether these cars actually make it to consumers is another question. For now, regulations generally require steering wheels, and Tesla still has to prove the vehicles can reliably drive themselves.

On the earnings call, Musk said production would be “very slow” but would ramp up and go “kind of exponential towards the end of the year and certainly next year.”

Rani Molla

Meta signs deal to use Amazon Graviton chips

Meta said it will deploy “tens of millions” of Amazon Web Services Graviton CPU cores to power so-called “agentic” AI systems — tools that can reason, plan, and act on their own. The move makes Meta one of the largest customers of Amazon’s in-house chips.

The deal also underscores a broader shift in AI infrastructure, as companies move beyond Nvidia GPUs and use different chips for different tasks.

Meta, which is working on its own custom inference chips, also has chip deals with Advanced Micro Devices and Nvidia.

Rani Molla

Oracle rises after Wedbush’s Dan Ives calls the stock a buy with 25% upside

Oracle extended its premarket gains Friday after Wedbush Securities’ Dan Ives initiated coverage with an “outperform” rating and a $225 price target — about 25% upside to its pre-initiation level — calling the enterprise software and cloud infrastructure company a “foundational infrastructure provider for the AI revolution.”

Ives argues investors are misreading Oracle’s heavy capital spending and negative free cash flow as risky, even though both are backed by a massive $553 billion backlog of contracted demand. He says the company’s “secret sauce” is a two-part strategy: building high-performance cloud infrastructure for AI workloads while connecting those models directly to companies’ own data.

“We believe Oracle is in the early innings of a significant repositioning as it executes on this generational opportunity,” Ives wrote.


Sherwood Media, LLC produces fresh and unique perspectives on topical financial news and is a fully owned subsidiary of Robinhood Markets, Inc., and any views expressed here do not necessarily reflect the views of any other Robinhood affiliate, including Robinhood Markets, Inc., Robinhood Financial LLC, Robinhood Securities, LLC, Robinhood Crypto, LLC, Robinhood Derivatives, LLC, or Robinhood Money, LLC. Futures and event contracts are offered through Robinhood Derivatives, LLC.