Live Text, which was a new feature added back in iOS 15, has received some pretty useful improvements in iOS 16. So today we’re going to briefly go over what’s new with Live Text in iOS 16. If you need a refresher, Live Text lets you highlight and select written or typed text in photos and interact with it just as you would text anywhere else in the operating system. It’s pretty crazy: if a photo contains a handwritten phone number or email address, you can long press on it to make a call or compose a new email.
One of the major new additions to Live Text is the ability to use it within a video rather than just inside photos. You can pause any video and interact with the text as you would inside an image. This means the text in any paused video frame can be copied, pasted, and translated, and it even works with Look Up. Live Text works across the operating system in Photos, Camera, Safari, and even other applications. So if you want to copy text, just pause the video on the frame you want, then touch and hold on a word. You can drag the two little blue markers to narrow or widen your selection, then press and hold to get options like Translate and Copy. Text in a photo or video also supports quick actions, something Apple added in iOS 15.
Now in iOS 16, there are new quick actions that let you track flights, track packages, translate languages, and even convert currencies right from a photo or video. If you have a photo of a sign in a foreign language, for example, you might see the Translate quick action available. Live Text can also be found inside the Translate app, where a new camera button opens the device’s camera so you can get real-time translations in different languages. Speaking of languages, there are three new ones: Japanese, Korean, and Ukrainian.
Lastly, in iOS 16, Live Text works with Spotlight search. If you’re looking for text that appears in a photo or video frame, you can just search for that keyword and the result will pop up in Spotlight. This is something I did last week without even really realizing it. I was looking for a picture of my vaccine card, so I typed in the keywords, and it showed me the picture. It worked that way because Live Text read the text inside the picture and was smart enough to surface it in Spotlight search.
This is a fantastic feature that worked perfectly when I needed it to. So what about you? I’d love to know your thoughts on Live Text, how you’re using it, and what you think of these new iOS 16 features in the comments down below!