The iPhone 16 was just announced, and it comes with an impressive new Apple Intelligence feature called Visual Intelligence.
Revealed at Apple's September ‘Glowtime’ event, the incredible new AI feature lets you press the new Camera Control button on the side of the phone and use a multimodal AI to get search results on the fly. Ever wanted to know what breed that dog walking by is? Sorted.
You can search anything your camera sees using Visual Intelligence and the new Camera Control button on the iPhone 16. Visual Intelligence works similarly to Google Lens, but ‘Designed by Apple in California’. While this new feature is definitely an exciting Apple Intelligence addition, I still wish Google Pixel’s Circle to Search feature was on iPhone.
The iPhone 16 and iPhone 16 Plus are powered by the new A18 chip, built from the ground up for Apple Intelligence. The Visual Intelligence feature won't arrive until later this year, though, meaning at launch you'll only be able to use the Camera Control button for, you guessed it, the camera.
Visual Intelligence on iPhone 16
The iPhone 16 isn’t just getting a new Camera Control button; it’s also getting the iPhone 15 Pro’s Action button, which can launch any shortcut you can think of.
The new Visual Intelligence feature is also available on the newly announced iPhone 16 Pro and iPhone 16 Pro Max, which pair the A18 Pro chip with larger displays than last year’s Pro models. The bigger 6.3- and 6.9-inch screens are perfect for Visual Intelligence, making it easy to frame whatever you want the AI to search for and view the results on a larger display.
Apple Intelligence arrives in beta next month, with further AI capabilities like Visual Intelligence and an upgraded, context-aware Siri coming soon after.