Here’s why the new iPhones and Vision Pro can’t run Apple Intelligence

Updated June 24: Daring Fireball’s John Gruber explains why Private Cloud Compute can’t support all of Apple Intelligence’s features.

If you were watching Apple’s demonstration of key Apple Intelligence features during the WWDC keynote on Monday, you were probably thinking about all the ways you could use the new service on your iPhone this fall. Once the keynote ended, however, many iPhone users were dismayed to learn that it wouldn’t work on their phones: Apple Intelligence is restricted to only the newest and most expensive models.

While Macs and iPads dating back to 2020 will get Apple Intelligence, iPhone support is limited to the 15 Pro and 15 Pro Max. That leaves out two of Apple’s newest phones, released just a few months ago, as well as all the older models still on sale and the iPhone SE.

While this may seem like an odd move, since the A16 chip in the iPhone 15 and iPhone 15 Plus is very fast, a new report from Ming-Chi Kuo sheds some light on things. As he notes, the A16’s Neural Engine is actually more powerful than the M1’s (17 trillion operations per second versus 11 TOPS), so the cutoff isn’t about the NPU. Rather, it comes down to memory: the iPhone 15 and 15 Plus have 6 GB of RAM, compared with at least 8 GB on every device that supports Apple Intelligence.

He breaks it down even further: “The demand for DRAM can be verified in another way. Apple Intelligence uses a 3B LLM on the device (which should be FP16, as the M1’s NPU/ANE supports FP16 well). After compression (using a mixed 2-bit and 4-bit configuration), approximately 0.7 to 1.5 GB of DRAM must be reserved at any time to run the Apple Intelligence LLM on the device.”
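Kuo’s range checks out as back-of-the-envelope arithmetic. A sketch below (the function name and the GB convention are my own, not from the report) computes the resident size of a 3-billion-parameter model at the two compression extremes he mentions:

```python
# Rough check of Kuo's DRAM estimate: a 3B-parameter model whose weights
# are compressed to a mix of 2-bit and 4-bit values.
PARAMS = 3e9  # 3 billion parameters

def model_footprint_gb(bits_per_weight: float) -> float:
    """Approximate weight storage in GB (using 1 GB = 2**30 bytes)."""
    return PARAMS * bits_per_weight / 8 / 2**30

low = model_footprint_gb(2.0)    # every weight at 2 bits
high = model_footprint_gb(4.0)   # every weight at 4 bits
fp16 = model_footprint_gb(16.0)  # uncompressed FP16, for comparison

print(f"compressed: {low:.2f} GB to {high:.2f} GB")  # ~0.70 GB to ~1.40 GB
print(f"FP16: {fp16:.2f} GB")                        # ~5.59 GB
```

The 2-bit and 4-bit endpoints land at roughly 0.70 GB and 1.40 GB, matching the quoted 0.7–1.5 GB range, and the FP16 figure shows why compression is mandatory on a phone with 8 GB of total RAM.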

At Daring Fireball, John Gruber explains why devices that don’t have enough memory can’t simply use Private Cloud Compute for most tasks: “The models that run on-device are entirely different models than the ones that run in the cloud, and one of those on-device models is the heuristic that determines which tasks can execute with on-device processing and which require Private Cloud Compute or ChatGPT.” He also says that Vision Pro does not get Apple Intelligence because the device “already makes significant use of the M2’s Neural Engine to complement the R1 chip for real-time processing purposes – occlusion and object detection, stuff like that.”

Rumors have previously claimed that all iPhone 16 models will have 8 GB of RAM, and based on the Apple Intelligence requirements, that is almost certainly the case. Kuo also speculates that future devices will likely start at 16 GB of RAM as Apple Intelligence “most likely evolves to a 7B LLM.” Some smartphones, like the OnePlus 12 and Xiaomi 14, already have 16 GB of RAM.

If you’re a coder, the situation is a little worse. The new predictive code completion AI in Xcode 16 requires an Apple Silicon Mac with 16 GB of RAM, according to Apple’s documentation.

When Apple Intelligence arrives with iOS 18 this fall, it will still be in beta. Reports indicate that it will nevertheless be a key selling point of the iPhone 16.

Teknory