
How to use Live Text in iOS 16 for your iPhone

Though iOS 16.1 is already available, iOS 16 as a whole comes packed with a variety of new features, including Live Activities, iCloud Shared Photo Library, and new collaboration tools.

It’s not all about brand-new features, though: iOS 16 also brings a major update to Live Text, the excellent feature that arrived last year with iOS 15.

The clever feature has been incredibly useful since its debut, letting users grab phone numbers, addresses, and plenty more from images. The good news is that it's now even better – so with this in mind, here’s what’s new for Live Text, and how to use it on your iPhone in iOS 16.

What is Live Text and what’s new in iOS 16?

[Image: iOS 16 Live Text (Image credit: Future)]

Live Text, in its simplest form, lets you work with text found within an image. For example, whether you’ve taken a photo of a restaurant menu or snapped a business card so you won’t forget it, Live Text will let you copy the text out and use it to send a message, create a note, use an email address, or make a call.

This is done using on-device intelligence, but that does mean that you’ll need an iPhone XS, XR or later. 

It’ll also work on iPadOS 16.1, released in October 2022, so you can use the feature across your devices when needed.

If you have an iPhone capable of running iOS 16, the headline Live Text addition this year is the ability to pull text from video content.

The idea is that if you’re watching, say, an educational video and want to add some of the on-screen information to your notes, you can pause the video and highlight the text directly.

Another big new feature is the inclusion of 'Quick Actions' for Live Text. This allows you to take a photo of text in another language, for example, and instantly translate it.

Other Quick Actions include starting an email when an email address is detected, calling phone numbers, converting currencies, and plenty more.

How to use Live Text in the Camera app

[Image: iOS 16 Live Text (Image credit: Future)]

In the Camera app of iOS 16, hold your device so that the text is clearly visible on your screen and tap the icon that looks like a barcode scanner in the corner of the viewfinder.

Doing so brings up the Live Text overlay, and lets you highlight parts of the text shown. You can also tap the 'Copy All' Quick Action in the corner, which will copy all of the text to your clipboard so you can paste it into an app immediately.

[Image: iOS 16 Live Text (Image credit: Future)]

In the example above, you can see that Live Text gets a little muddled and thinks the image is upside down, but the copied text will, naturally, come out in the correct orientation.

How to use Live Text in the Photos app

[Image: iOS 16 Live Text (Image credit: Future)]

The same process applies to images and videos in the Photos app. Simply tap the aforementioned barcode-style button and you’ll get the option to 'Copy All' or highlight individual parts of the text.

This can be particularly helpful if you haven’t got time to hover and copy down some information – simply grab a photo or video, and come back to it in your own time.

How to use Live Text in the Translate app

[Image: iOS 16 Live Text (Image credit: Future)]

One of the big additions in iOS 16 is being able to access Live Text through the impressive Translate app. If you’ve not used the app before, you’ll find it in your App Library, or via a Spotlight search.

Once you’re in, there’s a new option at the bottom of the screen to open the camera. It won’t translate in real time, as you’ll need to take a photo first, but the app works incredibly quickly.

You can pick the language after the fact, too, meaning if you need to translate a phrase into multiple languages, you can do so with ease.



Source: TechRadar
