iPhone 15 Pro owners will soon have one less reason to consider upgrading to the iPhone 16 series. Visual Intelligence, Apple's equivalent of Google Lens, will arrive on the 2023 flagships, according to Daring Fireball.
On the iPhone 16 and 16 Pro, Visual Intelligence can be triggered by long-pressing the dedicated Camera Control button. But like the recently announced iPhone 16e (which also supports the feature), the iPhone 15 Pro and Pro Max lack a physical camera button. So all three phones will instead assign it to the Action button or use a Control Center shortcut arriving in an upcoming software update.
Apple hasn't said which iOS version will bring the Apple Intelligence visual search feature to the iPhone 15 Pro series. However, Daring Fireball's John Gruber suspects it will be iOS 18.4, which could reach beta testers "any day now."
Part of the Apple Intelligence suite of AI features, Visual Intelligence lets you point your camera at something and use AI to analyze it in real time. It handles some tasks on its own, but for others it offers on-screen shortcuts to ChatGPT or Google image search.
So, suppose you find a set of towels with a unique pattern in your closet. You really like those dang towels and want to buy more, but you can't remember where you bought them. Activate Visual Intelligence, tap the Google search shortcut, and see if your beloved rags turn up in the web results. Alternatively, you can ask ChatGPT about the product and where to order it.
Visual Intelligence can also do some things without help from Google or OpenAI. You can interact with text: translate it, have it read aloud, or summarize it. Or you can learn about a business you point your phone at: view its hours, menus, and services, or buy something from it.
Either way, iPhone 15 Pro and Pro Max owners should get a taste of the feature soon. Perhaps sooner for those willing to brave the (sometimes rough) waters of beta software.