One of the new features announced at WWDC this year was Live Text. Coming to iOS 15, iPadOS 15, and macOS Monterey, Live Text allows users to select, copy, search, and even translate the text found in any image.
The demonstration during the keynote featured a whiteboard with handwritten text. When you open the Camera app on an iPhone and point it at the whiteboard, a small indicator appears in the lower-right corner showing that the text in the viewfinder has been recognized.
Tapping the indicator lets you copy or share the text wherever you want. During the demo, Craig Federighi, Apple's senior vice president of Software Engineering, pasted the captured text into an email message as a bulleted list.
Live Text works not only when capturing a photo but also with any image in the Photos app. Whenever the indicator appears in the lower-right corner of the screen, you can select and share text from the image.
Combined with system-wide translation in the new versions of iOS, iPadOS, and macOS, users can select text in a photo and tap the Translate command to read it in another language without ever leaving the image. For international travelers, this could make it far easier to read street signs, menus, and much more.
If an image includes a phone number or an address, users can tap the text in the image to place a call or look up a location. Even a restaurant sign can trigger a map search for that business, putting directions just a few taps away.
Live Text will also make words captured in photos searchable with Spotlight. Searching for a word or phrase will surface photo results that users can tap to jump directly to that photo in their library.
In addition to text recognition, Apple has incorporated a lookup feature for objects and scenes. When pointing the camera at, or viewing a photo of, art, books, nature, landmarks, or pets, the device will display relevant details with links to more information.
Live Text will recognize seven languages at launch: English, Chinese, French, Italian, German, Spanish, and Portuguese. The feature also requires an A12 Bionic chip or newer, first introduced with the iPhone XS.