Siri’s limits mean it still can’t hold a conversation or engage in a lengthy, project-oriented back-and-forth. For better or worse, the Siri we use today on our iPhones, iPads, MacBooks, Apple Watches, and Apple TVs isn’t much different from the one we first encountered in 2011 on an iPhone 4s.
Six years ago, I wrote about Siri’s first brain transplant, the moment Apple began using machine learning to train Siri and improve its ability to respond to conversational queries. The introduction of machine learning and, soon after, on-device neural processing in the form of the Neural Engine in Apple’s A11 Bionic chip on the iPhone 8 marked what I thought was a turning point for arguably the first consumer-grade digital assistant.
That programming and silicon helped Siri understand a question and its context, allowing it to move beyond rote replies and respond intelligently to more natural-language questions.
Early Siri was no ‘Her’
Not being able to fully talk to Siri didn’t seem like a big deal, even though we’d already seen the movie Her and understood what we could ultimately expect from our chatbots.
However, it wasn’t until that once-distant future was pulled into the present by OpenAI’s GPT-3 and ChatGPT that Siri’s deficits were thrown into stark relief.
Despite Apple’s best efforts, Siri has been idling in learning mode. Perhaps this is because Siri is still primarily built on machine learning and not generative AI. That’s the difference between learning and creating.
All the generative AI chatbots and image tools we use today create something entirely new from prompts, whether text or, soon, art and images. They are not answer bots; they are builder bots.
I doubt any of this is lost on Apple. The question is, what will and can Apple do about it? I think we need look no further than its upcoming Worldwide Developers Conference (WWDC 2023). We’re all fixated on the possible $3,000 mixed-reality headset Apple might show off in June, but the company’s most important announcements will surely revolve around AI.
“Apple must be under incredible pressure now that Google and Microsoft have rolled out their natural language solutions,” Moor Insights & Strategy CEO and Principal Analyst Patrick Moorhead told me over Twitter DM.
A chattier Siri
As reported by 9to5Mac, Apple may finally be working on its own language-generation update for Siri, codenamed Bobcat. Note that this is not the same as generative AI. I take that to mean Siri is getting a little better at casual banter, but I don’t expect much more than that.
Unfortunately, Apple’s own ethos may prevent it from catching up with GPT-3, let alone GPT-4. Industry observers aren’t exactly expecting a breakthrough moment.
“I think what they’re doing in AI won’t necessarily be a leap as much as a calculated and more ethically driven approach to AI in Siri. Apple loves, lives and dies by their privacy commitments and I expect no less in how they deliver a more AI-powered Siri,” Creative Strategies CEO and Principal Analyst Tim Bajarin wrote to me in an email.
Privacy above all else
Apple’s steadfast adherence to user privacy may leave it hampered when it comes to true generative AI. Unlike Google and Microsoft’s Bing, it doesn’t have a massive search-engine-powered data store to draw on, nor does it train its AI on the vast ocean of internet data. Apple does its machine learning on the device: an iPhone and Siri know what they know about you based on what’s on your phone, not on what Apple could learn from you and its 1.5 billion iPhone users worldwide.

Sure, developers can use Apple’s ML tools to build and embed new AI models in their apps, but they can’t simply collect your data to help Apple deliver a better Siri AI.
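For a sense of what that on-device approach looks like in practice, here is a minimal Swift sketch of running a bundled Core ML model entirely on the phone. The model name (“SentimentClassifier”) and its “text”/“label” feature names are hypothetical placeholders, not anything Apple ships; the point is simply that the prediction never leaves the device.

```swift
import CoreML

// Minimal sketch: on-device inference with a bundled Core ML model.
// "SentimentClassifier" and its "text"/"label" features are hypothetical.
func classifyLocally(_ text: String) -> String? {
    let config = MLModelConfiguration()
    config.computeUnits = .all  // allow CPU, GPU, and Neural Engine

    guard
        let url = Bundle.main.url(forResource: "SentimentClassifier",
                                  withExtension: "mlmodelc"),
        let model = try? MLModel(contentsOf: url, configuration: config),
        let input = try? MLDictionaryFeatureProvider(dictionary: ["text": text]),
        let output = try? model.prediction(from: input)
    else { return nil }

    // Everything above runs locally; nothing is sent to Apple's servers.
    return output.featureValue(for: "label")?.stringValue
}
```

That local-only design is great for privacy, but it also means the model only ever sees what is already on your phone.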
As I wrote in 2016: “It’s also interesting to consider how Apple is deliberately hampering its own AI efforts. Your purchasing habits in iTunes, for example, are not shared with any of Apple’s other systems and services.”
Apple’s on-device approach could also hold back any generative AI efforts. As Moorhead told me, “I see most of the action on the device and in the cloud. Apple is strong on the device but weak in the cloud, and that’s where I think the company will struggle.”
As I see it, Apple has a choice to make: give up a bit of user privacy to finally turn Siri into the voice assistant we’ve always wanted, or stay the course with incremental AI updates that improve Siri but never let it compete with ChatGPT.