Apple’s latest Apple Intelligence feature, Visual Intelligence, arrives with iOS 18.2. We tried the latest beta, which also brings several other features. Visual Intelligence turns your iPhone’s camera into an AI-powered, all-seeing search tool, much like Google Lens. Here’s everything you need to know about using Visual Intelligence on an iPhone running iOS 18.2 or later to supercharge your experience.
What is Visual Intelligence?
If Siri is a voice-based AI that answers your queries, Visual Intelligence is a vision-based AI that lets you point your camera at objects to learn more about them or perform quick tasks.
For example, point your iPhone at a restaurant and Visual Intelligence pulls up the restaurant’s rating and hours of operation from Apple Maps, and sometimes even its menu. Scan an event flyer, and it automatically offers to add the event to your Calendar or get directions to the location, saving you the hassle of entering the details manually.
You can also simply point your camera at anything to gather information via ChatGPT or search Google for similar images. For instance, you can identify the breed of an animal, the species of a plant, or the name of a product so you can buy it online.
Think of Visual Intelligence as a more integrated version of Google Lens with a few unique Apple twists. However, this feature is exclusive to the iPhone 16 series, as it leverages the power of the A18 chip.
How to Activate Visual Intelligence
Visual Intelligence is integrated directly into the Camera Control button on the iPhone 16. It is part of the iOS 18.2 update, which is currently in developer beta 1 and expected to roll out by the end of the year. To use it now, you’ll need to switch to the beta channel and update to iOS 18.2; otherwise, wait for the stable release.
- Long-press the Camera Control button. This opens a camera-style interface with a live preview and a capture button.
- Once active, point your camera at something you want to search for and tap the capture button to initiate Visual Intelligence.
- Regardless of what’s in the image, you’ll always get two options: Ask and Search.
- Ask: Uploads the image to ChatGPT, letting you chat about it and learn more.
- Search: Runs a Google search for similar images or product links so you can purchase items.
How You Can Use Visual Intelligence in Day-to-Day Life
- Point your iPhone at a landmark, and Visual Intelligence will display its name, historical information, visiting hours, and nearby amenities.
- Point your iPhone’s camera at a book cover or product, and use the Search option to find product reviews, purchase links, and other relevant details.
- Point your iPhone at a plant or animal and use the Ask or Search option to identify the species. You’ll get additional information, such as habitat details, species characteristics, or care tips.
- Point your iPhone at an event poster, and Visual Intelligence will capture the event name, time, date, and location, offering you the option to add it directly to your Calendar.
- If you encounter lengthy text, point your camera at it to generate a quick summary, or use ChatGPT to explain it in simpler terms.
Visual Intelligence is quite a useful tool, but it still lacks a few features. Google Lens, for example, lets you record a video and then perform a search based on it, which is handy when the subject is too large to fit in a single photo. Perhaps Apple will add something similar in a future update.
Ravi Teja KNTS
From coding websites to crafting how-to guides, my journey from a computer science engineer to a tech writer has been fueled by a passion for making technology work for you. I've been writing about technology for over 3 years at TechWiser, with a portfolio of 700 articles related to AI, Google apps, Chrome OS, Discord, and Android. When I'm not demystifying tech, you can find me engrossed in a classic film – a true cinephile at heart.