Apple may eventually power its AI systems with server processors of its own design, but for now it depends on third-party hardware. Unlike many of its competitors, which lean heavily on Nvidia in this area, Apple has turned to Google’s tensor processors.
As CNBC notes, this follows from an explanatory note published by Apple that describes the computing clusters the company used to train the large language models underpinning its Apple Intelligence technology. A preliminary version of that technology has already begun appearing on some devices, which prompted Apple to disclose details of how its artificial intelligence infrastructure was built.
Apple does not directly name the maker of the processors used to train its Apple Foundation Model, but the note refers to “cloud clusters based on TPUs” — the abbreviation Google uses for its tensor processing units. Among other things, this disclosure indicates that Apple leased cloud computing resources from Google, a perfectly reasonable approach at this stage of the company’s AI development.
The large language model that runs on Apple’s end devices was trained on a cluster of 2,048 Google TPU v5p processors, currently considered Google’s most advanced tensor chip. The server-side model was trained on a cluster of 8,192 TPU v4 processors. Google rents out such clusters at roughly $2 per hour for each processor a client uses. Notably, Google trains its own language models not only on its own processors but also on Nvidia hardware, whereas Apple’s documentation makes no mention of Nvidia chips.
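To give a sense of scale, the figures above allow a back-of-the-envelope rental-cost estimate. This is a minimal sketch: the $2/hour-per-chip rate and the cluster sizes come from the article, while the one-week training duration is a purely illustrative assumption (the actual training time is not disclosed).

```python
# Rough cost estimate for renting the TPU clusters described above.
# The per-chip rate and cluster sizes are from the article; the
# training duration below is a hypothetical assumption.

RATE_PER_CHIP_HOUR = 2.00   # USD per processor-hour, as cited
ON_DEVICE_CLUSTER = 2_048   # TPU v5p chips (on-device model)
SERVER_CLUSTER = 8_192      # TPU v4 chips (server-side model)

def cluster_cost(chips: int, hours: float,
                 rate: float = RATE_PER_CHIP_HOUR) -> float:
    """Total rental cost for `chips` processors running for `hours`."""
    return chips * hours * rate

# Illustrative example: one week (168 hours) of continuous training.
hours = 168
print(f"On-device cluster, one week: ${cluster_cost(ON_DEVICE_CLUSTER, hours):,.0f}")
print(f"Server cluster, one week:    ${cluster_cost(SERVER_CLUSTER, hours):,.0f}")
```

Even under this conservative one-week assumption, the server-side cluster alone would cost several million dollars to rent, which helps explain why leasing capacity can still be cheaper than building comparable infrastructure in-house.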