#185 The iPhone 16 Series: Shiny New Gold, But Where’s the AI Spark?
Fresh & Hot curated AI happenings in one snack. Never miss a byte 🍔
This snack byte will take approx 4 minutes to consume.
On September 9th, Apple’s much-anticipated “It’s Glowtime” event lit up the tech world as it unveiled the iPhone 16 series. The star of the show? Siri, Apple’s revamped voice assistant, now sporting AI enhancements under the flashy name "Apple Intelligence."
Oh, and for those who prefer their tech with a splash of color, the iPhone 16 Pro comes in a new “desert titanium”—which is just a fancy way of saying gold.
But while the color is all glitz, the new iPhone features? Not quite glowing yet.
CEO Tim Cook hyped up the new generative AI features—teased earlier in June—but left us all with a bit of a cliffhanger. The AI magic won’t fully roll out until at least October.
Meanwhile, the iPhone’s AI-powered camera will soon be able to identify restaurant menus (something your foodie friend has likely been doing manually for years). And Siri? You’ll be able to type at it. Yes, groundbreaking stuff.
Apple’s investors are hopeful that eventually, more conversational AI features will kick in, potentially giving iPhone sales a much-needed boost. After all, iPhone sales account for nearly half of Apple’s revenue, but recently, they’ve been sagging harder than an old mattress. That said, the market for AI-powered smartphones is still anyone's game.
Generative AI: Beyond the Cloud and Into Your Pocket
Apple is just one of many tech giants trying to move AI from massive data centers (the cloud) to our personal devices (the edge). Samsung beat Apple to the punch, releasing its Galaxy S24 with some AI goodies earlier this year. Microsoft, not one to be left out, has also thrown its hat in the ring with AI-infused Windows PCs, called Copilot+. But the race is still wide open.
The challenge?
Most large language models (LLMs), like the ones behind OpenAI’s ChatGPT, require mind-boggling amounts of computational power and energy.
Running them is so expensive that it reportedly costs OpenAI a hefty $0.36 each time someone asks their bot a question. That adds up quickly when millions are asking ChatGPT to explain quantum physics or draft awkward work emails.
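For a sense of how quickly that adds up, here’s a quick back-of-the-envelope calculation. The per-query figure is the reported estimate above; the daily query volume is a made-up illustration, not a published number.

```python
# Back-of-the-envelope: what per-query inference cost looks like at scale.
# The $0.36 figure is the reported estimate cited above; the query volume
# is purely illustrative, not a real usage statistic.
cost_per_query_usd = 0.36
queries_per_day = 10_000_000  # hypothetical daily volume

daily_cost = cost_per_query_usd * queries_per_day
annual_cost = daily_cost * 365

print(f"Daily inference cost:  ${daily_cost:,.0f}")   # $3,600,000
print(f"Annual inference cost: ${annual_cost:,.0f}")  # $1,314,000,000
```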
Edge devices, like your iPhone or Android, need to run smaller, more efficient AI models distilled from their larger cloud counterparts. These smaller models are cheaper and faster. The goal is to minimize latency to the point where interacting with your AI feels as natural as texting your best friend. Plus, edge AI can learn more about you from your own device (which Apple calls “semantic indexing”), so it’s like having a personal assistant that actually knows your coffee order.
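If you’re wondering what “distilled from their larger cloud counterparts” actually means in practice, the usual recipe is to train a small “student” model to imitate a big “teacher” model’s outputs. Here’s a minimal, generic sketch in PyTorch of Hinton-style distillation; it is not Apple’s (or anyone else’s) production code, and the vocabulary size and temperature are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soften both distributions and push the student toward the teacher.

    Generic knowledge-distillation loss; the temperature here is an
    illustrative choice, not a published setting.
    """
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between teacher and student, scaled by T^2 as is customary.
    return F.kl_div(student_log_probs, soft_targets, reduction="batchmean") * temperature**2

# Toy example: a "teacher" over a 32k-token vocabulary vs. a small "student".
teacher_logits = torch.randn(4, 32_000)                        # stand-in for cloud-scale model outputs
student_logits = torch.randn(4, 32_000, requires_grad=True)    # stand-in for the on-device model
loss = distillation_loss(student_logits, teacher_logits)
loss.backward()  # gradients flow only into the student
print(loss.item())
```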
Running AI on Devices: Easier Said Than Done
Despite all the buzz, running AI on smaller devices isn’t without its hurdles. First off, performance. Complex tasks, like planning an elaborate vacation, are still better handled by cloud-based LLMs; your phone’s processor just isn’t ready for them yet.
Then there's the battery life problem. Even smaller AI models are power-hungry, and no one wants their phone to die mid-conversation with their AI assistant.
So what’s the plan?
Apple Intelligence will handle simpler tasks on-device, but when the going gets tough, queries will be sent to Apple’s private cloud. If things get really tricky, third-party LLMs like ChatGPT may step in—though privacy-conscious users might raise an eyebrow at their data being shuttled back and forth like this.
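Conceptually, that hand-off could look something like the sketch below. To be clear, this is not Apple’s actual routing logic or any real API; the thresholds, names, and complexity score are invented purely to illustrate the on-device-first idea.

```python
from enum import Enum, auto

class Route(Enum):
    ON_DEVICE = auto()        # small local model, private by default
    PRIVATE_CLOUD = auto()    # first fallback: provider-operated private cloud
    THIRD_PARTY_LLM = auto()  # last resort: an external chatbot/LLM service

def route_query(estimated_complexity: float, user_allows_third_party: bool) -> Route:
    """Illustrative on-device-first routing. Thresholds are made up."""
    if estimated_complexity < 0.3:
        return Route.ON_DEVICE
    if estimated_complexity < 0.7 or not user_allows_third_party:
        return Route.PRIVATE_CLOUD
    return Route.THIRD_PARTY_LLM

# "What's on my calendar?" vs. "Plan a two-week trip across Japan."
print(route_query(0.1, user_allows_third_party=True))   # Route.ON_DEVICE
print(route_query(0.9, user_allows_third_party=False))  # Route.PRIVATE_CLOUD
print(route_query(0.9, user_allows_third_party=True))   # Route.THIRD_PARTY_LLM
```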
Smartphones already know an uncomfortable amount about us, from who we call to what we buy. If our AI assistants start tapping into that data, many users would prefer that it stay on the device.
To make edge AI more efficient, companies are exploring alternatives to GPUs, which currently dominate AI processing but can be energy guzzlers.
Enter Neural Processing Units (NPUs), which are designed to run AI models on the edge more efficiently. Qualcomm, a key player in the chip game, is leading the charge with NPUs that focus on maximizing “performance per watt.”
In short, the idea is to squeeze out AI processing power without draining your phone’s battery—or your wallet.
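“Performance per watt” is exactly what it sounds like: how much AI throughput you get for each watt of power. The arithmetic is simple, as the toy comparison below shows; the numbers are placeholders to illustrate the idea, not real GPU or NPU specs.

```python
# Performance per watt = throughput / power draw.
# All numbers below are illustrative placeholders, not real chip specs.
chips = {
    "discrete GPU (cloud)": {"tops": 400.0, "watts": 300.0},
    "mobile NPU (edge)":    {"tops": 40.0,  "watts": 5.0},
}

for name, spec in chips.items():
    tops_per_watt = spec["tops"] / spec["watts"]
    print(f"{name}: {tops_per_watt:.1f} TOPS/W")

# The NPU does far less absolute work, but each watt goes much further,
# which is the whole pitch for on-device AI.
```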
The Future of Edge AI: Up for Grabs
With cloud-based LLMs heavily reliant on Nvidia’s powerful GPUs, the shift to edge AI represents an exciting opportunity for new players. “There’s nobody that dominates edge AI yet,” says Taner Ozcelik, CEO of Mythic, a startup specializing in energy-efficient AI chips.
This means companies like Apple, Qualcomm, and others could win big by mastering AI on edge devices.
Neil Shah from Counterpoint Research believes that the arrival of edge AI could spark a supercycle in device sales, not just for smartphones but also for a whole ecosystem of AI-driven apps and digital advertising.
However, we’re not quite there yet. As fun as the “Glowtime” marketing may be, the reality is that edge AI still needs more time to fully live up to the hype.
So, what’s next for Apple’s AI-driven iPhone 16? The shift toward edge AI is promising, and Apple’s positioning could help reignite iPhone sales.
But until Apple Intelligence is fully operational and the market for edge AI matures, we’ll have to wait and see if the glow is real—or just another shiny marketing gimmick.
For now, enjoy your “desert titanium” iPhone 16, and maybe hold off on asking Siri to plan that vacation. She might need a little more time to catch up.
You might be better off calling your friendly neighbor instead.