Apple Unveils “Visual Intelligence” Feature Coming to iOS 18
Apple has officially revealed a new feature called “Visual Intelligence” that will join the Apple Intelligence suite in the upcoming iOS 18 operating system for iPhone. The feature is built around the camera, letting users access services through whatever they point their phone at.
Apple held its annual event today at its headquarters in the United States under the title “It’s Glowtime,” where it revealed the iPhone 16 and iPhone 16 Pro, along with the new generation of its smartwatches, the Apple Watch Series 10, as well as its new earbuds, the AirPods 4.
Apple had already discussed upcoming artificial intelligence features for its operating systems at the WWDC developer conference last June. However, the company has been working on additional features, and it unveiled “Visual Intelligence” today at the “It’s Glowtime” event.
The company stated that Visual Intelligence lets users instantly recognize objects seen through their phone cameras, providing an interactive experience and quick information about anything the camera is pointed at.
Craig Federighi, Apple’s Senior Vice President of Software Engineering, explained that the feature is activated through the new touch-sensitive Camera Control button, now part of the iPhone 16 and iPhone 16 Pro.
How Visual Intelligence Works
Users can activate Visual Intelligence by pressing and holding the new Camera Control button on the side of the phone, then pointing the camera at the object they want more information about. Thanks to a combination of on-device artificial intelligence and Apple’s cloud services, users receive accurate information without the photos being stored on their phones.
For example, users can take a picture of a restaurant to see its hours of operation, or point the camera at an advertising poster to automatically capture details such as the address, date, and location.
Federighi also said that Visual Intelligence will serve as a gateway to third-party AI services and models: the feature can be used to search for real-world products through services like Google, or to photograph study notes and get immediate help understanding them.
While Apple did not give an exact release date for the feature, it confirmed that it will arrive as part of Camera Control later this year, adding a powerful new element to the iPhone user experience.