The tech world is abuzz with the latest update from Apple: the rollout of ‘Visual Intelligence’ to the iPhone 15 Pro. This feature, reminiscent of Google Lens, is set to transform how we interact with our smartphones, bringing a new level of intelligence to our everyday tasks.
Originally showcased with the iPhone 16 series as part of its ‘Apple Intelligence’ capabilities, ‘Visual Intelligence’ is now making its way to the iPhone 15 Pro via iOS 18.4. On the 15 Pro, the feature is triggered with the Action button, enabling users to perform image-based searches with remarkable ease.
Imagine pointing your iPhone at an object and instantly receiving detailed information. This is the power of ‘Visual Intelligence.’ It’s not just about identifying objects; it’s about enhancing our understanding of the world around us. This feature is a testament to Apple’s commitment to pushing the boundaries of AI integration in mobile technology.
The iOS 18.4 beta is already in circulation, and the full release is anticipated in April alongside a significant Siri upgrade. While Apple has not disclosed an exact launch date for ‘Visual Intelligence’ on the iPhone 15 Pro, the excitement is palpable.
This update raises interesting questions about the future of smartphone AI. How will features like ‘Visual Intelligence’ impact our daily lives? What other innovations can we expect from Apple in the realm of AI?
For a comprehensive look at this exciting development, I encourage you to read my detailed article: