Scanning QR Codes With Your iPhone Is About to Get Easier in iOS 17

When Apple releases iOS 17 in the fall, anyone who tends to use their iPhone's camera to scan QR codes can expect to find the process a little easier, thanks to a small but significant change coming in Apple's latest mobile operating system.


Apple first introduced QR code scanning in the iPhone's Camera app in iOS 11. Back then, the link encoded in the QR code appeared as a push notification-style banner at the top of the screen.

Perhaps because this implementation was inelegant or confusing for some users, Apple redesigned QR code scanning in iOS 13 so that the link appeared as a yellow button within the camera viewfinder itself. In doing so, however, it created a new problem: the button would rove around the viewfinder whenever the camera lens was in motion, which made tapping it even trickier than before.

Thankfully, in iOS 17, Apple has made another small, and this time welcome, change that improves the situation immeasurably. Now when you scan a QR code, the link button automatically appears at the bottom of the Camera interface. So instead of chasing the dancing link around the viewfinder, you can simply tap its fixed location above the shutter button.

In fact, there is a way to get the QR code link to behave in a similar manner in iOS 16: as soon as you move the camera so that the QR code is no longer in the shot, the link should drop to the bottom of the viewfinder and stay there. Still, once iOS 17 arrives later this year, you'll no longer need to perform this extra step to tame an itinerant link.

This article, "Scanning QR Codes With Your iPhone Is About to Get Easier in iOS 17" first appeared on MacRumors.com

Source: TechRadar
