Apple has announced a new feature called Live Text, which will digitize the text in all your photos. This unlocks a slew of handy functions, from turning handwritten notes into emails and messages to searching your camera roll for receipts or recipes you've photographed.
This is certainly not a new feature for smartphones, and we've seen companies like Samsung and Google offer similar tools in the past. But Apple's implementation does look typically smooth. With Live Text, for example, you can tap on the text in any photo in your camera roll or viewfinder and immediately act on it. You can copy and paste that text, search for it on the web, or, if it's a phone number, call that number.
Apple says the feature is enabled using "deep neural networks" and "on-device intelligence," with the latter being the company's preferred phrasing for machine learning. (It stresses Apple's privacy-focused approach to AI, which processes data on-device rather than sending it to the cloud.)
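Apple hasn't detailed the models behind Live Text, but on-device text recognition of this kind has been available to third-party developers since iOS 13 through the Vision framework. As a rough illustration of what "on-device intelligence" means in practice, a minimal Swift sketch of the same underlying task, pulling recognized lines of text out of a `CGImage` without any network call, might look like this (the `recognizeText` helper is my own naming, not Apple's):

```swift
import Vision
import CoreGraphics

// Minimal sketch: run the Vision framework's on-device OCR over a
// CGImage and return the recognized lines of text. Requires
// iOS 13+/macOS 10.15+; all inference happens locally on the device.
func recognizeText(in image: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate   // slower, higher-quality model
    request.usesLanguageCorrection = true  // clean up common OCR slips

    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])

    let observations = request.results as? [VNRecognizedTextObservation] ?? []
    // Each observation carries ranked candidate readings; keep the best.
    return observations.compactMap { $0.topCandidates(1).first?.string }
}
```

Live Text layers interactions (copy, search, call) on top of results like these, but the recognition step itself is the kind of on-device processing Apple is describing.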
Live Text works across iPhones, iPads, and Mac computers and supports seven languages: English, Chinese (both simplified and traditional), French, Italian, German, Spanish, and Portuguese. It also integrates with Apple's Spotlight search feature on iOS, allowing you to search your camera roll based on the text in images.
In addition to extracting text from photos, iOS 15 will let users search their images visually, a feature that Apple calls Visual Look Up and that sounds essentially identical to Google Lens.
Apple didn't go into much detail about this feature during its presentation at WWDC, but it said the new tool would recognize "art, books, nature, pets, and landmarks" in photos. We'll have to test it out in person to see exactly how well it performs, but it sounds like Apple is doing much more to apply AI to users' photos and make that information useful.