A new behind-the-scenes report in The Information details Apple’s struggles to keep pace with AI capabilities and innovation amid the rise of large language models (LLMs) that power cutting-edge tools like ChatGPT.
The article focuses on the efforts of John Giannandrea, the company’s AI chief since 2018, to bring order to a fragmented AI group and make Apple more competitive with companies like Google, which Giannandrea left to join Apple.
In some ways, The Information’s piece is a recap or confirmation of what we already know — like Apple employees’ frustrations with the limitations of Siri’s underlying technology, which had previously been reported — but it calls on new sources to add further context and depth to the narrative.
For example, it reveals that the team working on Apple’s long-in-development mixed reality headset was so frustrated with Siri that it considered developing an entirely separate, alternative voice control method for the headset.
But it goes beyond simply reporting on neutral details; rather, it lays out all that information in a structured case to argue that Apple is ill-prepared to compete in the fast-paced AI space.
Think different, indeed
As Google restructures itself to bet on products like Bard and Microsoft injects ChatGPT and related AI capabilities into a wide range of products from Bing to Word to GitHub, Apple’s recent approach to artificial intelligence has been different: it has almost exclusively focused on practical applications in features for the iPhone. The emphasis is on using machine learning to improve palm rejection on the iPad, give iPhone users more clever photo-editing tricks, and improve suggestions in Apple’s content-oriented apps, among other similar things.
It’s a different tactic than the ambitious, blue-sky experimentation and innovation you see from companies like OpenAI, Microsoft, or Google. Apple has been relatively conservative, treating artificial intelligence and machine learning as tools to improve the user experience, not as a way to fundamentally reinvent how things get done or disrupt existing industries.
Indeed, The Information’s sources provide several examples of senior Apple management putting the brakes on (or at least reining in) aggressive efforts within the company’s AI group for fear of seeing products like Siri exhibit the same kinds of embarrassing factual errors or inappropriate behavior as ChatGPT and its ilk have done. In other words, Apple is not keen on tolerating what many who work in AI research and product development call “hallucinations”.
For example, Siri’s responses are not generative — they are human-written and human-curated. Apple’s leadership has been hesitant to allow Siri developers to push the voice assistant toward the detailed back-and-forth conversations seen in recent LLM-powered chatbots. Leadership reportedly views such exchanges as more attention-grabbing than useful, and Apple is worried about being held responsible for bad responses.
Some engineers within the company have argued that Apple should be more tolerant of bizarre edge cases and factual errors, saying that a certain scale — and a degree of comfort with failure — is needed to truly improve these systems. Notably, several senior figures have jumped ship for Google or startups out of frustration with Apple’s conservative thinking.
In addition, Apple has increasingly focused on running AI and machine learning capabilities locally on users’ devices — both because it enables faster response times and because of the company’s public commitment to user privacy. For some features, that’s an advantage (as Giannandrea explained to Ars Technica in 2020). But to date, LLMs typically run in the cloud, and some have questioned whether they will ultimately work as well on-device.
Nevertheless, The Information’s sources say that Apple engineers have already started working on some major LLM-powered features, and that the company hopes to introduce them in an iOS update next year. However, we don’t yet know what those features will be, nor do we know anything about the approach Apple is taking in developing and implementing them.
Analysis: Winning the race might not be everything
There’s no doubt that Apple (at least judging from the outside) is lagging behind its Big Tech competitors in radical new AI innovations, even though its software and devices are now packed with AI-powered features that improve the user experience in small but meaningful ways. And commentators are right to question whether Apple can compete when its approach has historically been so conservative.
That said, there are plenty of people who say the rapid development of ChatGPT and its ilk and Microsoft’s gung-ho approach with Bing Chat could prove reckless, with huge, potentially negative unintended consequences. Apple’s conservative streak may be the right move in the long run — at least when it comes to minimizing externalities.
The hype around generative AI and LLMs is strong and for good reason. But we don’t yet know exactly how all this will play out. It’s never been in Apple’s DNA to roll the dice to find out. Rather, the company has sometimes found its greatest successes in picking up the leftover pieces after other, more ambitious innovators crashed, burned, and took others with them.
That said, it’s understandable that ambitious AI developers want to work in an environment relatively unencumbered by bureaucracy and restrictions. The Information makes a particularly compelling case on this point: Apple’s brain drain could ultimately be the company’s undoing as it seeks to compete with Google, Microsoft, and others — even more than any difference in philosophy. And the explosive innovation in this particular space may be different from the markets where Apple’s usual strategy has worked.
Right now, Apple is thriving compared to most of its competitors, but new developments in artificial intelligence could threaten its position in the long run. It will be fascinating to see what Apple does with the new LLM features hinted at in The Information’s report – will it compromise its commitment to virtually no errors, or will it loosen things up to stay competitive?
There’s no way for us to know now, but we’ll probably find out in the next year or two.