Various AI models are now readily available, but they all run on remote servers, i.e. in the cloud. Apple, however, hopes to achieve a breakthrough by embedding artificial intelligence directly into the iPhone.

Whether we talk to ChatGPT or Bard, their "knowledge" lives somewhere far away from us, though thanks to the Internet this is almost imperceptible. Apple, on the other hand, is preparing something completely different: it wants to put artificial intelligence in the user's pocket. According to MacRumors, the company's AI researchers write that by developing an innovative technique for using flash memory, they have made it possible to run large language models (LLMs) directly on iPhones and other Apple devices with limited memory.


As is well known, LLM-based chatbots require massive amounts of data and memory, which can be a problem for the iPhone, which does not have unlimited memory. In other words, high-performance models need a lot of storage, and typical smartphones, such as the iPhone 15 with its 8 GB of RAM, struggle to accommodate models with hundreds of billions of parameters. Apple's researchers therefore set out to innovate on the memory front, developing a new technique that uses flash memory to store the AI model's data. Their method cleverly overcomes this limitation with two key techniques that minimize data transfer and maximize flash memory throughput.
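To see why 8 GB of RAM is so limiting, a back-of-envelope calculation helps: a model's weight storage is roughly its parameter count times the bytes per parameter. The sketch below uses illustrative figures (fp16 weights, hypothetical model sizes), not numbers from the research itself.

```python
def model_size_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Approximate weight storage in GB; fp16 weights take 2 bytes each."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# Even a modest 7B-parameter model in fp16 needs ~14 GB of weights,
# already exceeding the iPhone 15's 8 GB of RAM; a 70B model needs ~140 GB.
print(model_size_gb(7))    # 14.0
print(model_size_gb(70))   # 140.0
```

This is why simply loading such a model into a phone's memory is impossible, and why the researchers turned to the device's much larger flash storage instead.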

In their research paper, the authors describe two approaches. "Windowing" is a kind of recycling method: instead of loading new data each time, the AI model reuses some of the data it has already processed. This reduces the need to constantly fetch data from flash memory, making the process faster and smoother.
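The reuse idea can be illustrated with a toy sliding-window cache: weight chunks that were recently used stay in fast memory, and only chunks not already resident trigger a slow flash read. This is a simplified sketch of the general caching principle, not the researchers' actual implementation; the chunk IDs and payloads are placeholders.

```python
from collections import OrderedDict

class WindowedWeightCache:
    """Keep a window of recently used weight chunks in fast memory;
    read from flash only when a chunk is not already resident."""

    def __init__(self, window_size: int):
        self.window_size = window_size
        self.cache = OrderedDict()  # chunk_id -> weights (LRU order)
        self.flash_reads = 0

    def _load_from_flash(self, chunk_id: int) -> str:
        self.flash_reads += 1           # stand-in for a slow flash read
        return f"weights[{chunk_id}]"   # placeholder payload

    def get(self, chunk_id: int) -> str:
        if chunk_id in self.cache:
            self.cache.move_to_end(chunk_id)      # reuse: no flash access
        else:
            self.cache[chunk_id] = self._load_from_flash(chunk_id)
            if len(self.cache) > self.window_size:
                self.cache.popitem(last=False)    # evict the oldest chunk
        return self.cache[chunk_id]

cache = WindowedWeightCache(window_size=3)
for cid in [0, 1, 2, 1, 0, 3, 2]:
    cache.get(cid)
print(cache.flash_reads)  # 5: two of the seven accesses reuse cached data
```

Seven chunk accesses cost only five flash reads here; at scale, that kind of reuse is what cuts the data-transfer bottleneck.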

Row-column bundling is a technique similar to reading a book in larger chunks rather than one word at a time. By grouping related data together, it can be read from flash memory in fewer, larger operations, accelerating the AI's ability to understand and generate language.
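The intuition can be sketched in a few lines: if two pieces of data that are always needed together (here, the i-th row of one weight matrix and the i-th column of another) are stored contiguously, a single sequential read fetches both. The matrix names and sizes below are hypothetical illustrations, not the paper's layout.

```python
import numpy as np

# Toy weight matrices: an "up" projection and a "down" projection of an
# MLP layer, whose i-th row and i-th column are consumed together.
d_model, d_ff = 4, 8
up = np.arange(d_ff * d_model, dtype=np.float32).reshape(d_ff, d_model)
down = np.arange(d_model * d_ff, dtype=np.float32).reshape(d_model, d_ff)

# Bundle: concatenate up[i, :] with down[:, i] into one contiguous
# record per neuron, so one flash read returns both pieces.
bundled = np.concatenate([up, down.T], axis=1)   # shape (d_ff, 2 * d_model)

i = 3
record = bundled[i]                              # a single contiguous read
up_row, down_col = record[:d_model], record[d_model:]
assert np.array_equal(up_row, up[i])             # same data, one read
assert np.array_equal(down_col, down[:, i])
```

Reading one larger record instead of two scattered ones is exactly the "larger chunks" analogy: flash memory delivers far better throughput on big sequential reads than on many small random ones.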

Combining these approaches allows AI models to run at up to twice the size of the iPhone's available memory. According to the researchers, this yields a 4-5x increase in inference speed on standard processors (CPUs) and a 20-25x increase on graphics processors (GPUs). "This breakthrough is particularly important for deploying advanced LLMs in resource-constrained environments, thereby expanding their applicability and accessibility," the researchers wrote.

This advance in AI efficiency opens up new possibilities for future iPhones, such as more advanced Siri capabilities, real-time language translation, and cutting-edge AI-driven features in photography and augmented reality. This technology also lays the foundation for iPhones to run complex AI assistants and chatbots on the same device.


If you would like to read more news like this, like the HVG Tech section's Facebook page.