Is the Quest 3 coming with great AI innovations?



Picture: Meta


Meta has developed a new AI model. If it finds its way into the Quest 3, the XR headset could become a real innovation.

Mark Zuckerberg spends vast sums to drive the development of the metaverse. For Meta, this includes AI systems. The latest innovation in this field is the Segment Anything Model (SAM), an AI segmentation model. Read a detailed analysis in the linked article from AI-focused sister publication THE DECODER.

According to Meta, SAM “has a general idea of what objects are and can create masks for any object in any image or video, even for objects and image types the system hasn’t encountered during training”.

SAM is versatile – even in XR

As a result, the model can be used in the future “for applications in numerous fields where any object in any image needs to be found and segmented.”

SAM’s design reportedly allows flexible integration with other systems: “In AR/VR, SAM could allow the selection of an object based on the user’s gaze and then ‘project’ it in 3D.”
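The core idea behind this kind of promptable segmentation is that the model takes an image plus a single prompt, such as the point a user is looking at, and returns a mask covering the object under that prompt. The toy sketch below illustrates that idea with a simple flood fill on a synthetic image; it is not the SAM API or architecture, just a minimal, hypothetical stand-in assuming roughly uniform object intensity.

```python
# Toy illustration of point-prompted segmentation: given an image and one
# "click" (e.g. a gaze point), return a mask for the object at that point.
# NOT the SAM model -- a flood-fill sketch on a synthetic frame.
from collections import deque

import numpy as np


def point_prompt_mask(image, seed, tol=0.1):
    """Flood-fill outward from `seed`, keeping pixels within `tol` of the seed value."""
    h, w = image.shape
    seed_val = image[seed]
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if abs(float(image[ny, nx]) - float(seed_val)) <= tol:
                    mask[ny, nx] = True
                    queue.append((ny, nx))
    return mask


# Synthetic "passthrough frame": dark background with one bright object.
frame = np.zeros((32, 32))
frame[8:16, 8:16] = 1.0  # the object the user's gaze lands on

mask = point_prompt_mask(frame, seed=(12, 12))
print(int(mask.sum()))  # 64 -- exactly the 8x8 object, nothing else
```

A real model like SAM learns to produce such masks for arbitrary objects from visual features rather than raw intensity, which is what lets it generalize to objects it never saw during training.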

This video shows the view through a VR headset with AR passthrough. Using SAM, the headset detects the objects and people in focus. Overlays show what each one is (at least when the detected object has been labeled) and how far away it is from the wearer.

In Meta’s blog post about SAM, a similar video makes the recognition even clearer. Which headset was used for the clips is unclear; it could be a Quest Pro. Possibly this or similar technology will be available for the Meta Quest 3.

SAM in Meta Quest 3: A game changer for XR?

This would expand the possible applications of the upcoming XR headset many times over. The headset could help visually impaired people by recognizing objects and distances and announcing them via audio output. Or it could support language learning by displaying all recognized objects in the desired language.

Meta itself shows possible applications for AR headsets. For example, SAM could recognize baking ingredients on an AR headset and automatically display suitable recipes.

In addition, SAM could determine the structure of the physical environment from the headset's view and transfer it to the virtual world, adapted to the VR software. This could eliminate the need to draw a guardian boundary, for example: users could move freely in virtual reality because it would mirror the layout of physical reality.


