Google’s Role in Apple’s AI Success: A Closer Look


On Monday, Apple, led by CEO Tim Cook, unveiled a high-profile partnership with OpenAI that brings the company's powerful AI system into its voice assistant, Siri. With its elegantly manufactured products and easy-to-use operating systems, Apple has come a long way from being a mere hardware company and now positions itself as an AI giant. Nevertheless, it has been reported that Google has quietly been instrumental in Apple's AI advancements. In this blog post, let me share with you how Apple has benefited from Google's expertise and infrastructure in AI.


Google’s Role in Apple’s AI Success

However, in the fine print of a technical document Apple released after the event, the company indicated that Alphabet's Google was a clear winner in Apple's effort to create a new generation of artificial intelligence. To build the foundation AI models it required, Apple's engineers used Google's framework software together with Google's in-house chips, the tensor processing units (TPUs), which are available exclusively through Google Cloud. Google has been designing and manufacturing TPUs for roughly 10 years and has publicly discussed two versions of its fifth-generation chips that can be used for AI training; according to Google, the performance version of the fifth generation is roughly as efficient as Nvidia's H100 AI chips.
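Apple's document does not spell out the exact software stack in this post, so as a rough illustration only, here is a minimal Python sketch of how a training job can discover and run a computation on Cloud TPUs using the open-source JAX library, which is one common way to target that hardware. The toy array shapes are made up for the example.

```python
import jax
import jax.numpy as jnp

# List the accelerators visible to this process. On a Cloud TPU VM this
# returns TpuDevice entries; on an ordinary machine it falls back to CPU.
print(jax.devices())

# A toy computation, compiled with XLA and run on whatever backend is present.
@jax.jit
def affine(w, x, b):
    return jnp.dot(x, w) + b

x = jnp.ones((8, 128))   # batch of 8 example vectors
w = jnp.ones((128, 64))  # weight matrix
b = jnp.zeros((64,))     # bias
print(affine(w, x, b).shape)  # -> (8, 64)
```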

Apple and Google researchers are known for joining forces on AI systems. By pooling experience and results, they achieve faster progress in fields such as natural language processing, computer vision, and speech recognition, which helps both companies and the AI sector as a whole evolve and advance. Google disclosed at its annual developers' conference that it would release a sixth generation of the TPU later this year.

Apple also uses transfer learning, a process in which a network trained for one purpose, such as image recognition, is retrained for a more specific task. Apple's underlying AI applications reportedly build on Google's pre-trained models, including BERT (Bidirectional Encoder Representations from Transformers). In this manner, more accurate and complex models can be created by starting from the knowledge the pre-trained model already encodes and extending it with Apple's own data, as sketched below.
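The post names BERT only as an example of a pre-trained model, so this is not Apple's internal tooling; it is a minimal transfer-learning sketch using the open-source Hugging Face transformers library, with a placeholder checkpoint name, toy texts, and made-up intent labels.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Start from a publicly released BERT checkpoint and attach a fresh
# two-class classification head, then fine-tune it on task-specific data.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

texts = ["set a timer for ten minutes", "what does this photo show"]
labels = torch.tensor([0, 1])  # toy intent labels for illustration

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
outputs = model(**batch, labels=labels)
outputs.loss.backward()  # gradients also flow into the pretrained weights
optimizer.step()
```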

Apple prioritizes user privacy. Google's work in the field of differential privacy, which protects each person's data by introducing carefully calibrated noise, has been beneficial to Apple's approach to AI. Through these methods, Apple can still collect and analyze user data while keeping individuals' information private.
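Neither company's production mechanism is described in the post; as a hedged illustration of the general idea, here is a sketch of the classic Laplace mechanism applied to a counting query, with the count and the privacy parameter epsilon invented for the example.

```python
import numpy as np

rng = np.random.default_rng()

def laplace_count(true_count: int, epsilon: float) -> float:
    # A counting query changes by at most 1 when one user is added or removed,
    # so its sensitivity is 1 and Laplace noise with scale 1/epsilon suffices.
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Report roughly how many users triggered a feature without exposing any
# individual; a smaller epsilon means more noise and stronger privacy.
print(laplace_count(true_count=1042, epsilon=0.5))
```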

Given the current pace of advancement in AI, the partnership between Apple and Google will probably only deepen. Whether it is Siri understanding user voices more accurately or the Photos app letting users search for images more easily, both companies understand the benefit of working together.

The processors are built for AI compute and model training, and Google's cloud hardware and software are designed around those needs.
Apple and Google were asked to comment on the matter but did not immediately respond.

Apple did not make clear how heavily it depended on Google's chips and software rather than the hardware offered by AI competitors such as Nvidia.
However, using Google's chips typically requires a client to license or purchase access to them through Google's cloud platform, much as customers buy computing time from Amazon's AWS or Microsoft's Azure to deliver their services.
