How Apple used Google’s help to train its artificial intelligence models
Max A. Cherney
SAN FRANCISCO, June 11 (Reuters) – Apple Chief Executive Tim Cook on Monday announced an agreement with OpenAI to incorporate the startup’s powerful artificial intelligence model into Apple’s voice assistant, Siri.
But in fine print Apple released after the event, the company makes clear that Alphabet’s Google has emerged as another winner in the Cupertino, California-based company’s quest to catch up in artificial intelligence.
To build Apple’s foundation artificial intelligence models, the company’s engineers used its own framework software with a range of hardware, specifically its own graphics processing units (GPUs) and chips available only on Google’s cloud, called tensor processing units (TPUs).
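The article does not name Apple’s framework or show its code, but the mix of GPUs and TPUs it describes is characteristic of XLA-based toolchains such as JAX, where a single compiled training step runs on whichever accelerator the runtime finds. The sketch below is purely illustrative, assuming a toy linear model; none of the names or numbers are Apple’s.

    # Minimal JAX sketch: the same jit-compiled training step runs on CPU,
    # GPU, or Cloud TPU without code changes. Toy example, not Apple's code.
    import jax
    import jax.numpy as jnp

    def loss_fn(params, x, y):
        preds = x @ params["w"] + params["b"]   # toy linear model
        return jnp.mean((preds - y) ** 2)       # mean squared error

    @jax.jit  # XLA compiles this step for the available backend
    def train_step(params, x, y, lr):
        grads = jax.grad(loss_fn)(params, x, y)
        # Plain SGD update, applied leaf-by-leaf over the parameter tree.
        return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

    print("devices:", jax.devices())  # e.g. [TpuDevice(...)] on Google's cloud
    key = jax.random.PRNGKey(0)
    x = jax.random.normal(key, (64, 8))
    y = jax.random.normal(key, (64, 1))
    params = {"w": jnp.zeros((8, 1)), "b": jnp.zeros((1,))}
    for _ in range(100):
        params = train_step(params, x, y, 0.01)
    print("loss:", float(loss_fn(params, x, y)))

Because the hardware-specific work happens in the compiler rather than the model code, a team can move the same training program between its own GPUs and rented TPUs, which is the kind of portability the Apple disclosure implies.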
Google has been building TPUs for about a decade and has publicly announced two versions of its fifth-generation chips that can be used for AI training; the performance version, according to Google, is competitive with Nvidia’s H100 AI chips.
At its annual developer conference, Google also announced that a sixth generation will launch this year.
The processors are designed specifically for running artificial intelligence applications and training models, and Google has built a cloud computing hardware and software platform around them.
Apple hasn’t explained how much it relies on Google’s chips and software versus hardware from Nvidia or other AI vendors.
But using Google’s chips typically requires a customer to purchase access to them through Google’s cloud division, just as customers buy computing time from Amazon.com’s AWS or Microsoft’s Azure.
(Reporting by Max A. Cherney; Spanish-language editing by Hector Espinosa)