Apple has announced a new feature called Live Text, which will digitize the text in all your photos. This unlocks a slew of handy functions, from turning handwritten notes into emails and messages to searching your camera roll for receipts or recipes you've photographed.
This is certainly not a new feature for smartphones, and we've seen companies like Samsung and Google offer similar tools in the past. But Apple's implementation does look typically smooth. With Live Text, for example, you can tap on the text in any photo in your camera roll or viewfinder and immediately take action from it. You can copy and paste that text, search for it on the web, or, if it's a phone number, call that number.
Apple says the feature is enabled using "deep neural networks" and "on-device intelligence," with the latter being the company's preferred phrasing for machine learning. (It stresses Apple's privacy-heavy approach to AI, which focuses on processing data on-device rather than sending it to the cloud.)
Live Text works across iPhones, iPads, and Mac computers and supports seven languages: English, Chinese (both simplified and traditional), French, Italian, German, Spanish, and Portuguese. It also integrates with Apple's Spotlight search feature on iOS, allowing you to search your camera roll based on the text in images.
In addition to extracting text from photos, iOS 15 will also allow users to search visually, via a feature Apple calls Visual Look Up that sounds functionally identical to Google Lens.
Apple didn't go into much detail about this feature during its presentation at WWDC, but it said the new tool would recognize "art, books, nature, pets, and landmarks" in photos. We'll have to test it out in person to see exactly how well it performs, but it sounds like Apple is doing much more to apply AI to users' photos and make that information useful.