Next year, Siri will become natively aware of on-screen content, allowing it to execute more advanced tasks inside apps.
iOS 18.2 developer beta 1 introduces Visual Intelligence exclusively on the iPhone 16. Through it, users can scan their surroundings and have ChatGPT or Google's reverse image search provide details about the location or subject. Notably, iOS 18.2 also packs a hidden alternative to this feature, allowing iPhone 15 Pro, iPad, and Mac users to rely on ChatGPT to analyze images and on-screen content.
As spotted by 9to5Mac, Siri on iOS 18.2 lets users identify content on their screens with the help of ChatGPT. When you ask Siri to use ChatGPT to identify a building on your iPhone, for example, it will automatically take a screenshot. Once you approve sending the screenshot to OpenAI's servers, ChatGPT will describe what it sees. You can then ask follow-up questions if you want to learn more.
So, those without access to Visual Intelligence can simply take a regular photo on iOS 18.2, open it, and ask ChatGPT to analyze it. It certainly takes more time than the iPhone 16's dedicated feature, but it's a handy workaround for those with iPhone 15 Pro models or M-series iPads and Macs.
By mid-2025, Siri is expected to detect on-screen content natively and complete tasks inside apps without the help of ChatGPT. Until then, users can rely on this workaround to inquire about the things they encounter in daily life.
The second batch of Apple Intelligence features, including ChatGPT support, will launch in December as part of iOS 18.2. However, those with spare devices can try the first developer beta right away.