Apple has introduced a new feature called Visual Intelligence with the iPhone 16. Based on the demo Apple showed at its September 2024 event, Visual Intelligence appears to be Apple’s answer to Google Lens.
The feature is activated by a new touch-sensitive button on the right side of the device, called Camera Control. With a click, Visual Intelligence can identify objects, provide information, and offer actions based on what you point it at. Aim it at a restaurant to instantly pull up menus and ratings, or snap a flyer for an event to add it directly to your calendar. Curious about a dog’s breed? Point and click to find out. Eyeing a bike for purchase? Click to search for it online.
Apple claims that Visual Intelligence is private, meaning the company does not know what you clicked on.
This is a developing story…