Apple prepares its own "ChatGPT" that will work directly on the iPhone
The Cupertino company seemed far from competing in the artificial intelligence sector, but it is quietly and patiently developing its own technology. According to research published by the company's own experts, the result could rival ChatGPT, with one key difference: it would not be cloud-based.
The information comes from a new paper titled 'LLM in a flash', prepared by a group of Apple researchers. According to that study, flash storage is far more plentiful on smartphones than the RAM on which LLMs typically run, so the researchers propose keeping the model's parameters in flash and loading only the ones needed at any given moment into memory.
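To picture what that means in practice, here is a minimal Python sketch of the underlying idea: the model's weights live in a flash-resident file and only the slices needed for the current computation are copied into RAM. The file name, dimensions and row indices are invented for illustration and are not taken from Apple's paper.

```python
import numpy as np

# Toy dimensions; a real model's layers have shapes like this, but there are many of them.
HIDDEN, FFN = 4096, 11008

# One-time setup: write a weight matrix to a flash-resident file (stand-in for the checkpoint).
weights = np.memmap("ffn_up.bin", dtype=np.float16, mode="w+", shape=(FFN, HIDDEN))
weights[:] = np.random.randn(FFN, HIDDEN).astype(np.float16)
weights.flush()

# Inference time: map the file read-only. Nothing is copied into RAM yet.
flash_weights = np.memmap("ffn_up.bin", dtype=np.float16, mode="r", shape=(FFN, HIDDEN))

# Only the rows actually needed for the current token are paged in from flash.
needed_rows = [12, 407, 4088, 9001]               # e.g. neurons predicted to be active
active = np.asarray(flash_weights[needed_rows])   # small RAM footprint: 4 rows, not 11008
print(active.shape)                               # (4, 4096)
```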
The researchers highlight two techniques that make this possible. The first is windowing, in which the AI reuses data it has recently processed instead of loading it from flash again. The second is "row-column bundling", which groups related pieces of the model together so they can be read from flash in larger, more efficient chunks, accelerating the AI's ability to generate language.
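As a rough illustration of how the two techniques cut down flash traffic, the following Python sketch bundles each neuron's row and column into one contiguous record and caches the neurons used in a sliding window of recent tokens. All names, sizes and the "active neuron" lists are hypothetical; the actual paper applies these ideas inside the model's feed-forward layers rather than to NumPy files.

```python
import numpy as np

HIDDEN, FFN, WINDOW = 4096, 11008, 5   # toy sizes; WINDOW = how many recent tokens to remember

# Row-column bundling (sketch): for each FFN neuron, store its up-projection row and its
# down-projection column side by side, so one contiguous flash read fetches both at once.
setup = np.memmap("ffn_bundled.bin", dtype=np.float16, mode="w+", shape=(FFN, 2 * HIDDEN))
setup[:] = np.random.randn(FFN, 2 * HIDDEN).astype(np.float16)
setup.flush()
flash = np.memmap("ffn_bundled.bin", dtype=np.float16, mode="r", shape=(FFN, 2 * HIDDEN))

# Windowing (sketch): keep the neurons used for the last WINDOW tokens cached in RAM and
# read from flash only the ones that are newly needed for the current token.
cache = {}     # neuron index -> bundled weights already resident in RAM
history = []   # one set of active neuron indices per recent token

def load_neurons(active_now):
    history.append(set(active_now))
    if len(history) > WINDOW:
        expired = history.pop(0)
        still_needed = set().union(*history)
        for idx in expired - still_needed:
            cache.pop(idx, None)              # evict neurons that fell out of the window
    new = [i for i in active_now if i not in cache]
    for i in new:
        cache[i] = np.asarray(flash[i])       # one contiguous read per newly needed neuron
    return len(new)

print(load_neurons([1, 2, 3, 900]))    # 4 flash reads on the first token
print(load_neurons([2, 3, 900, 55]))   # only 1 read: the rest are reused from the window
```

The point of the sketch is the access pattern, not the numbers: fewer, larger reads from flash and heavy reuse of what was already loaded, which is where the reported speedups come from.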
These methods could pave the way for an 'Apple GPT' that runs directly on the phone, even without a data connection. According to the paper, the techniques deliver a 4-5x increase in AI processing speed on the mobile CPU and up to 20-25x on its GPU. "This breakthrough is especially crucial for deploying advanced LLMs in resource-constrained environments, thus extending their applicability and accessibility," the researchers write.