
Apple Collecting Data to Improve Augmented Reality Location Accuracy in Maps

With the launch of iOS 17.2, Apple has outlined the Maps-related data it collects to improve the augmented reality location function. In a new support document, Apple says it aims to bolster the speed and accuracy of augmented reality features in the Maps app.


When using augmented reality features in Maps, such as immersive walking directions or the Refine Location option, Apple collects information on "feature points" that represent the shape and appearance of stationary objects like buildings. The data does not include photos or images, and the collected feature points are not readable by a person.

According to Apple, Maps uses on-device machine learning to compare feature points to Apple Maps reference data that is sent to the iPhone. The camera filters out moving objects like people and vehicles, with Apple collecting just the feature points of stationary objects.
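Apple has not published how this comparison works internally, but the general idea of matching observed feature points against downloaded reference data can be sketched as a simple nearest-neighbor match. Everything below is an illustrative assumption, not Apple's implementation: the 2D coordinates, the function names, and the mean-displacement estimate of the device's offset are all hypothetical simplifications of what a real 3D localization pipeline would do.

```python
import math

def nearest_reference(point, reference_points):
    """Return the reference feature point closest to an observed point
    (hypothetical stand-in for a real feature-matching step)."""
    return min(reference_points, key=lambda r: math.dist(point, r))

def estimate_offset(observed, reference_points):
    """Estimate the device's positional offset as the mean displacement
    between observed points and their nearest reference matches."""
    dx = dy = 0.0
    for p in observed:
        r = nearest_reference(p, reference_points)
        dx += p[0] - r[0]
        dy += p[1] - r[1]
    n = len(observed)
    return (dx / n, dy / n)

# Hypothetical reference map of stationary feature points.
reference = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
# The same features as observed by a device shifted by (0.5, -0.25).
observed = [(0.5, -0.25), (10.5, -0.25), (0.5, 9.75)]
print(estimate_offset(observed, reference))  # (0.5, -0.25)
```

A real system would work with 3D points, camera poses, and robust matching rather than a plain average, but the sketch shows why comparing local observations against shared reference geometry can pinpoint a position more precisely than GPS alone.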

The comparison between the feature points and the Apple Maps reference data allows Maps to pinpoint a user's location and provide detailed walking directions with AR context. Using either AR walking directions or Refine Location refreshes Apple's reference data, improving augmented reality accuracy over time.

Data that Apple collects is encrypted and not associated with an individual user or Apple ID. Apple also uses on-device machine learning to add "noise" to the feature point data, introducing irregular variations that guard against any "unlikely" attempt to use feature points to reconstruct an image.
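Apple does not describe the noise mechanism, but the basic idea of perturbing coordinates before they leave the device can be sketched in a few lines. This is a minimal illustration under stated assumptions: the bounded-uniform noise, the `scale` parameter, and the tuple representation are all hypothetical choices, not Apple's actual privacy technique.

```python
import random

def add_noise(feature_points, scale=0.05, seed=None):
    """Perturb each coordinate with a small bounded random offset so the
    exact original geometry cannot be recovered (illustrative only)."""
    rng = random.Random(seed)
    return [(x + rng.uniform(-scale, scale), y + rng.uniform(-scale, scale))
            for x, y in feature_points]

points = [(1.0, 2.0), (3.0, 4.0)]
noisy = add_noise(points, scale=0.05, seed=42)
# Each noisy coordinate stays within `scale` of the original value.
assert all(abs(a - b) <= 0.05
           for p, q in zip(points, noisy)
           for a, b in zip(p, q))
```

The design intuition is that matching still works on slightly perturbed points, while the added variation makes faithful reconstruction of the original scene geometry much harder.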

According to Apple, only an "extremely sophisticated attacker" with access to the company's encoding system would be able to recreate an image from feature points, but because the data is encrypted and limited to Apple, "an attack and recreation are extremely unlikely."

Users can prevent Apple from collecting this data by disabling the "Improve AR Location Accuracy" toggle, found in the Settings app under Privacy & Security > Analytics & Improvements.
This article, "Apple Collecting Data to Improve Augmented Reality Location Accuracy in Maps" first appeared on MacRumors.com




Source: TechRadar
