Facebook working on custom silicon, face and eye tracking for Quest 3 and 4

Facebook looks to ditch Android, Qualcomm, Apple, and Google in the future.

What you need to know

  • Facebook is looking to become less dependent on Android and Qualcomm as it develops its own AR/VR OS and chipsets.
  • Oculus Quest 3 and 4 will likely feature some form of eye and face tracking to enhance social apps by creating more realistic avatars.
  • Future Quest products should have significantly higher-resolution displays with HDR tech to better match how our eyes perceive reality.

An interview with Facebook CEO Mark Zuckerberg revealed several details about the future of the Oculus Quest platform, including Facebook's quest to become less dependent on the other FAANG companies in Silicon Valley. The Oculus Quest 2 was the first step, moving to a more VR-centric SoC from Qualcomm, whereas the original Oculus Quest used a chipset similar to those found in mobile phones. Facebook's goal moving forward is to create custom silicon and operating systems that tightly optimize the experience for form factors that would be "socially acceptable," as Zuckerberg put it.

Those socially acceptable form factors include smaller and lighter hardware that would also run a lot cooler than the current Quest or Quest 2. While those headsets don't get noticeably hot for the user, Zuckerberg says that Facebook's desire to create significantly more powerful chipsets for AR and VR hardware means the company has to come up with new ways of cooling things down. In other words, they "don't want to burn people's faces off," as Zuckerberg so eloquently put it. Alongside the custom silicon is a custom operating system (OS) that would put Facebook in control of everything, including ways to optimize the experience better than it currently can with Android.

In addition, Facebook is continuing to work on better and brighter displays that would more realistically recreate the world outside the headset. Current displays look mostly fine, but the goal is a "retinal display" that would make individual pixels imperceptible, something that's certainly not the case with current VR display tech. Even with a more powerful processor, though, Facebook will need to rely on what's known as foveated rendering, which renders the area the eye is looking at in full detail while lowering detail in the periphery, to create sharper imagery without hurting performance. The company also continues to work on varifocal lenses, a technology that lets users focus on objects at different depths more naturally than they can with current fixed-focus VR lenses.
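The core idea behind foveated rendering can be sketched in a few lines: shade at full resolution near the gaze point and progressively coarser toward the edges of the screen. This is a minimal, illustrative Python sketch; the function name, radii, and shading rates are assumptions for demonstration, not part of any Oculus API.

```python
import math

def shading_rate(px, py, gaze_x, gaze_y, fovea_radius=0.2, mid_radius=0.5):
    """Return a sample-density multiplier for a pixel, given normalized
    screen coordinates in [0, 1]. 1.0 means full resolution near the
    gaze point; lower values mean coarser shading toward the periphery.
    The radii and rates here are illustrative placeholders."""
    dist = math.hypot(px - gaze_x, py - gaze_y)
    if dist <= fovea_radius:
        return 1.0   # foveal region: full detail
    if dist <= mid_radius:
        return 0.5   # mid-periphery: half the shading samples
    return 0.25      # far periphery: quarter the shading samples

# With eye tracking, gaze_x/gaze_y follow the eye; without it
# ("fixed" foveation), they stay pinned to the lens center.
print(shading_rate(0.5, 0.5, 0.5, 0.5))    # gaze point -> 1.0
print(shading_rate(0.95, 0.95, 0.5, 0.5))  # far corner -> 0.25
```

Eye tracking is what upgrades this from "fixed" to "dynamic" foveation: instead of always assuming the user looks through the lens center, the high-detail region follows the gaze, so the resolution savings become invisible to the wearer.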

As one would assume, Facebook's next user-facing focus is the social experience on the Quest platform. Face and eye tracking, creepy as that may sound, would be used to create more realistic avatars with real expressions that can make actual eye contact with someone else in VR. The idea is an avatar convincing enough to make you feel like you're really in the room with another person, rather than just some goofy cartoon stand-in. Zuckerberg gave a nod to Epic's recently released MetaHuman avatar generator in Unreal Engine.

Zuckerberg also talked about spatial audio and its importance in heightening realism, as well as the concept of developing non-invasive neural interfaces, like an armband or something similar that you would wear but could easily remove. The idea is to create more natural interfaces, not unlike the hand tracking currently available on Quest, by using the body's ability to form additional neural paths, similar to how physical therapy works. Facebook wants to avoid something like Elon Musk's Neuralink, which really isn't meant for consumers since, as Zuckerberg put it, it would require them to "get their head drilled open."

Source: androidcentral
