OpenAI CEO Sam Altman talks with Apple Senior Vice President of Services Eddy Cue during WWDC 2024. (Justin Sullivan/Getty Images)

Will Apple Intelligence make apps obsolete?

Apple’s AI moment arrives — and raises lots of questions about what this means for the future of its ecosystem.

Casey Newton

CUPERTINO — By the time Apple got around to talking about artificial intelligence on Monday, it had already delivered a full keynote’s worth of announcements about improvements to come across its proliferating array of operating systems. Here was a novel math app that solved your handwritten equations in real time; there was a feature that would let you operate your iPhone from your Mac.

These are the sort of welcome but modest improvements the company often shows off at its Worldwide Developers Conference: gently enhancing the user experience on its devices, but rarely offering anything that would send you running to the Apple Store to upgrade to a newer and more capable device.

Then, halfway through, the pre-recorded keynote arrived at the announcement Mark Gurman’s extraordinary reporting over the weekend had prepared us for: Apple Intelligence, a suite of operating system-level applications of AI that represents the company’s first major effort to integrate generative models into its product lineup.

It was a moment that has seemed inevitable since November 2022, when ChatGPT launched and catalyzed global interest in how AI can enhance products. In the 18 months since, impatient investors have worried that Apple might be letting the moment pass it by. Savvier observers have noted that this is how Apple has worked for decades now: approaching new technologies deliberately, and on its own time; developing its distinctive take on the product; and releasing it only when polished to the company’s quality standards. 

Judging from the preview, Apple Intelligence was created in just this way. The company took time to develop principles around what AI should do on its devices. It landed on a suite of AI features for the operating system, designed to make its devices more valuable by leveraging the massive amount of personal data on your devices. (Sensitive to the implications of such an invasive technology, Apple also took pains to develop a more private approach to data processing for AI apps.)

The question now is how polished those features will feel at release. Will the new, more natural Siri deliver on its now 13-year-old promise of serving as a valuable digital assistant? Or will it quickly find itself in a Google-esque scenario where it’s telling anyone who asks to eat rocks?


It will be some time before we know. Journalists were not offered a chance to try any of the new features today, nor could we even ask questions of any of the executives. (Instead we were herded into the Steve Jobs Theater to watch the YouTuber iJustine lob carefully vetted softballs at Apple executives Craig Federighi and John Giannandrea for a half-hour.)

As a result, for now we can’t answer how well it works. And so the most interesting question available to discuss is more like: what is all of this pointing to?

During the keynote, Federighi — Apple’s senior vice president of software engineering — laid out the company’s principles for AI. It should be helpful; it should be easy to use; it should be integrated into products you’re already using; and it should be personalized based on what it knows about you.

Much of what Apple showed off today has long been available for free in other apps. (Perhaps that’s why, as MG Siegler noted today, the company’s stock was actually down about 2 percent after the event.) You’ll be able to automatically generate text almost anywhere you can type in the operating system, Federighi said, whether that be in Pages, Mail, or a third-party app. Similarly, you can use text-to-image tools to create custom emoji or generate DALL-E style images using a new app called Image Playground. (Notably, Image Playground will not generate photorealistic images, likely in an effort to prevent misuse.)

Here the pitch is less about innovation than it is convenience. The present-day AI experience involves a lot of copying and pasting between apps; Apple Intelligence promises to do the work directly on your device and route the resulting data around the operating system for you.

But it’s in Federighi’s final principle — that AI should be personalized around what it knows about you — that Apple’s real advantage is apparent. It’s how the company distinguishes itself from (friendly) rivals like OpenAI or Anthropic, which at the moment offer you only a box to type into, and have limited memory of how you have used their chatbots. Apple can pull from your email, your messages, your contacts, and countless other surfaces throughout the operating system, and — in theory — can draw from them to help you more easily navigate the world.

Apple Intelligence also represents a chance to reboot Siri, its perpetually tin-eared and tone-deaf voice assistant. The company demonstrated Siri handling more difficult syntax than it did previously, and with a longer memory. It will also be able to control more parts of the operating system. 

Still, it was not nearly as impressive as what OpenAI showed off last month with its emotionally intelligent, low-latency (and hugely controversial) voice mode for ChatGPT. I was left wondering whether Apple might be open to a deeper partnership with OpenAI to improve Siri, or whether the company still hopes to catch up over time.

Speaking of OpenAI, that company did get some limited time on screen Monday. ChatGPT will be integrated into Siri, but somewhat halfheartedly: Siri will still endeavor to answer questions on its own, while routing only some queries to OpenAI. In the demo, Siri makes you tap to confirm that you are OK with Siri doing this — which might be sensible from a privacy standpoint, but feels deeply annoying as a user experience. (Why not just map your iPhone’s action button to ChatGPT’s voice assistant and bypass it altogether?)

In any case, while the Apple partnership clearly represented a win for OpenAI after a bruising few weeks, I was also struck by the degree to which Apple played down its significance during the keynote. Sam Altman did not appear in it, though he was present at the event. And at the iJustine event, Federighi took the unusual step of saying that other models — including Google Gemini — would likely be coming to the operating system.

“We think ultimately people will have a preference for which models they want to use,” he said. 

The most vexing part of the new Siri, at least as it was shown, is not whether it works but remembering what it can do. The company flashed a few screens of possibilities: add an address to a contact card; show certain very specific photos; “make this photo pop.” But how do you remember to do that in the moment? The invisible interface has always been the problem with voice assistants, and I wonder if Apple is doing enough to address it. (One employee said during the keynote that Siri can now answer thousands of questions about how to use Apple’s operating systems, so maybe that’s one way.)

There were also — if you squinted — hints of a much different future for computing. In the demo that I found most compelling, an employee asked Siri “how long will it take to get to the restaurant” and the OS figured out the answer by consulting email, text messages, and maps. I’ve written a few times lately about how AI (or at least Google’s version of it) has put the web into a state of managed decline; today’s keynote raised the question of whether AI will induce a similar senescence in the app economy.

It’s kind of a grim thought for a developer conference — which is perhaps why Apple did not dwell on it.


Casey Newton writes Platformer, a daily guide to understanding social networks and their relationships with the world. This piece was originally published on Platformer.

More Tech


After Tesla earnings, prediction markets think unsupervised FSD is less likely than ever to be rolled out this year

Tesla’s unsupervised full self-driving technology, which would autonomously ferry passengers around without a human driver having to pay attention, is supposed to help catapult the electric vehicle company’s valuation further into the stratosphere. It was also supposed to be available this year, but prediction market participants, as well as former Tesla self-driving leaders, no longer think that will happen.

On Tesla’s earnings call this week, CEO Elon Musk said the company now had “clarity” on achieving unsupervised full self-driving — something he’s repeatedly said would be available at least in some markets this year.

The comments seemed to give participants in Polymarket’s prediction markets some clarity. There, the market-implied probability that Tesla will release unsupervised FSD this year reached its lowest point since the event contract was opened in May.

The odds of it happening had been pretty high up until late June, when Tesla’s long-awaited robotaxi launched with a safety driver in the passenger seat. The unsupervised FSD event contract specifies the feature can have “no requirement for human intervention.”

Rani Molla

Banks prepare record $38 billion debt financing to fund Oracle-tied data centers

Banks led by JPMorgan and Mitsubishi UFJ are preparing a $38 billion debt offering to fund two Oracle-tied data centers in Texas and Wisconsin, Bloomberg reports. The projects, developed by Vantage Data Centers, will support Oracle’s $500 billion Stargate AI infrastructure push with OpenAI and Nvidia.

The loans — $23.25 billion for Texas and $14.75 billion for Wisconsin — are expected to mature in four years, price about 2.5 percentage points higher than the benchmark rate, and mark the largest AI infrastructure financing to date.

Oracle executives recently said that the company anticipates cloud gross margins will reach 35% and that it expects to see $166 billion in cloud infrastructure revenue by FY 2030.

Oracle is up 1.5% premarket.


Rani Molla

Google rises on official announcement of Anthropic deal worth “tens of billions”

Google has made official its deal to expand AI compute for Anthropic, which Bloomberg reported earlier this week. In order to train and serve its Claude model, Anthropic has agreed to pay Google Cloud “tens of billions of dollars” to access up to 1 million tensor processing units, or TPUs, as well as other cloud services.

Google, of course, has a 14% stake in Anthropic, making this one of the many circular AI deals happening at the moment.

“Anthropic and Google have a longstanding partnership and this latest expansion will help us continue to grow the compute we need to define the frontier of AI,” Anthropic CFO Krishna Rao said in the press release. “Our customers — from Fortune 500 companies to AI-native startups — depend on Claude for their most important work, and this expanded capacity ensures we can meet our exponentially growing demand while keeping our models at the cutting edge of the industry.”

The announcement has sent Google up again, more than 1% premarket.

Rani Molla

Report: Snap seeking $1 billion to finance its AR glasses division in “existential” fundraise

Snap is down more than 1% this morning following news that the company is attempting to raise $1 billion for its AR glasses unit in what a source told Sources.news was an “existential” fundraise.

A Snap spokesperson countered, “We do not need to raise money to execute against our plans to publicly launch Specs in 2026, but remain open to opportunities that could accelerate our growth.”

Multiple investors are involved in the talks, including Saudi Arabia’s Public Investment Fund, according to Sources.news. The report also noted that Snap plans to turn the unit that makes its Specs glasses into an independent subsidiary à la Google’s Waymo “that can continue raising capital from investors.”

Snap plans to produce about 100,000 units of next year’s Specs, pricing them around $2,500.

The beleaguered stock saw quite a bit of retail interest last month, amid r/WallStreetBets chatter that its low nominal price made it a potential acquisition target.



Sherwood Media, LLC produces fresh and unique perspectives on topical financial news and is a fully owned subsidiary of Robinhood Markets, Inc., and any views expressed here do not necessarily reflect the views of any other Robinhood affiliate, including Robinhood Markets, Inc., Robinhood Financial LLC, Robinhood Securities, LLC, Robinhood Crypto, LLC, or Robinhood Money, LLC.