Apple has revealed that it relied on Google’s Tensor Processing Units (TPUs) to build the infrastructure behind its new AI software, according to a research paper published on Monday. The choice is a notable departure from the industry norm, in which Nvidia’s GPUs are the usual hardware for training AI models.
According to the technical paper, Apple’s foundation model (AFM) and the AFM-server model that supports it were trained on Google’s Cloud TPU clusters. The details suggest that Apple rented capacity from Google’s cloud computing services to handle the heavy computation its AI models require.
Apple unveils new AI system
Alongside the technical paper, Apple also showcased its AI system, Apple Intelligence, which brings an improved Siri, better natural language processing, and AI-generated summaries. Over the next year, Apple plans to add further generative AI capabilities, including image and emoji generation and an enhanced Siri that can retrieve and act on information from within apps.
The technical paper also states that Apple’s on-device AFM was trained on 2,048 TPU v5p chips and that AFM-server was trained on 8,192 TPU v4 chips. The TPU v5p, announced in December, is Google’s current generation and improves the efficiency of AI computation. Google’s TPUs, made available to customers in 2017, remain among the most sophisticated custom chips for running AI workloads.
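The paper does not describe Apple’s training code, but for readers curious what spreading a training step across many TPU chips looks like in practice, the sketch below is a minimal, hypothetical JAX example of data-parallel training over a device mesh. The model, batch shapes, and learning rate are illustrative assumptions, not details taken from Apple’s paper.

```python
import numpy as np
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# Build a 1-D device mesh over whatever accelerators are present
# (TPU chips on a Cloud TPU VM; falls back to CPU/GPU elsewhere).
devices = np.array(jax.devices())
mesh = Mesh(devices, axis_names=("data",))

# A toy linear model with a squared-error loss; shapes are made up.
def loss_fn(params, x, y):
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

@jax.jit
def train_step(params, x, y, lr=1e-3):
    grads = jax.grad(loss_fn)(params, x, y)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

# Replicate the parameters on every chip; split the batch along "data".
replicated = NamedSharding(mesh, P())
batch_sharded = NamedSharding(mesh, P("data"))

params = {
    "w": jax.device_put(jnp.zeros((128, 1)), replicated),
    "b": jax.device_put(jnp.zeros((1,)), replicated),
}
x = jax.device_put(jnp.ones((len(devices) * 8, 128)), batch_sharded)
y = jax.device_put(jnp.ones((len(devices) * 8, 1)), batch_sharded)

# XLA partitions the step across the mesh and inserts the cross-chip
# gradient reduction automatically.
params = train_step(params, x, y)
```

At the scale the paper describes, the same idea is applied across thousands of chips, with the compiler handling how work and communication are divided among them.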
Nvidia has remained the dominant supplier of GPUs for training neural networks, but the high cost and limited availability of its chips have pushed organizations to seek alternatives. OpenAI, Microsoft, and Anthropic are among Nvidia’s key customers, while other tech giants such as Google, Meta, Oracle, and Tesla are also investing heavily in building out their AI capabilities.
Tech leaders weigh the risks and benefits of AI investment
Meta CEO Mark Zuckerberg and Alphabet CEO Sundar Pichai recently acknowledged the risk of overinvesting in AI infrastructure, but agreed that underinvesting carries the greater risk of losing ground in a technology they expect to be a game changer. Google, while one of Nvidia’s largest customers, also uses its own TPUs to train AI models and offers access to Nvidia GPUs through its cloud services.
According to Apple, Google’s TPUs enable it to build larger and more complex AI models. The research paper, which builds on details first disclosed at Apple’s Worldwide Developers Conference in June, indicates that these chips provide the computing power the company’s AI plans require. The effects of those hardware decisions will become clearer as Apple rolls out its AI features to beta users this week.