Tim Cook recently identified one specific AI feature – Visual Intelligence – as one of the most popular Apple Intelligence features. And rumors suggest it could be much stronger later this year.
Visual Intelligence is a hit among Apple users, says Tim Cook

Visual Intelligence debuted on the iPhone 16 as part of Camera Control.
On iPhones with Camera Control, you can long press the button to enter Visual Intelligence mode, or optionally assign a Control Center or Lock Screen button to do the same.
Visual Intelligence essentially combines your camera with artificial intelligence to provide new features such as:
- translating a sign into your native language
- adding an event from a flyer to your calendar
- browsing a restaurant's reviews, photos, and more
iOS 26 expanded Visual Intelligence in a big way: it is no longer limited to the camera. Now, through screenshots, anything you see on your iPhone can be analyzed with AI.
Take a screenshot in iOS 26, and as part of the redesigned screenshot viewer, you’ll find Visual Intelligence options like the ones above.

One of my favorite uses has been taking a screenshot of plain-text URLs to turn them into tappable links.
On Apple’s quarterly earnings call last week, Tim Cook specifically named Visual Intelligence as one of Apple Intelligence’s most popular features. He said:
One of our favorite features is visual intelligence, which helps users learn and do more with content on the iPhone screen than ever before, making it faster to search, take actions, and answer questions across apps.
Why did Cook choose to specifically highlight the success of Visual Intelligence? We can’t know for sure.
However, one reason may be that Cook knows Apple has big plans to expand the feature soon.
The new AirPods Pro and Apple Glasses are rumored to use Visual Intelligence

Visual Intelligence is nice as a camera feature, and I’ve found it especially useful as a screenshot option.
However, rumors suggest the feature will soon expand to two new platforms, where it could really shine.
A new top-of-the-line AirPods Pro 3 model and Apple Glasses are expected to be unveiled later this year.
Both products are rumored to have built-in cameras paired with a key AI feature. You guessed it: Visual Intelligence.
Here’s Mark Gurman writing about the new AirPods:
Apple’s ultimate plan for visual intelligence extends far beyond the iPhone. The company wants to build this feature into the core of future devices, including the camera-equipped AirPods I wrote about a few months ago.
He says similarly of Apple Glasses:
the idea is to turn the glasses into an Apple Intelligence device. The product will analyze the surrounding environment and provide information to the wearer.
In other words, Apple plans for Visual Intelligence to be a big part of its wearable lineup in the not-too-distant future.
So if there was one AI feature Cook might want to draw particular attention to, it’s no surprise it was Visual Intelligence.
What are you using Visual Intelligence for on your iPhone today? Let us know in the comments.