Michael Dykier

Will Apple's Vision Pro Limit Video Access for AR Creators?

According to an article Upload VR published a month ago, Apple's new Vision Pro AR/MR headset will not allow developers to deploy custom computer vision and machine learning algorithms that require access to the camera's video feed to function.

Apple's Vision Pro headset could pose a challenge for third-party AR creators

One month ago, a rumor surfaced that Apple will not allow third-party developers to use their own computer vision algorithms in future Vision Pro apps. This is not entirely surprising: other VR headset manufacturers like Meta and Pico commonly restrict developers from accessing similar video feeds on their headsets. However, it could be a significant setback for AR developers, who routinely use such video access in mobile apps to deliver scanning, ML image processing, and similar features in their AR/MR applications.


DAS Labs is very excited about the Vision Pro's imminent release, and we hope this rumor is false. Limiting access in this way would be a mistake on Apple's part and could hurt the industry as a whole. Below, we explain why third-party developers should retain this access, from our perspective as AR developers with 9+ years of focused AR/MR experience.


1. Environmental Context:


AR/MR is only compelling and useful if the experience has "environmental context". Enabling passthrough video on VR headsets by itself will not cause significant AR/MR applications to materialize (see the Meta Quest Pro's failed release). While we do anticipate that Apple's depth-sensor-generated data and some "out of the box" object classification capabilities will be available to Vision Pro developers, Apple's decision will unfortunately prevent the most unique custom AR apps and concepts from working on the Vision Pro headset on day one.
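To make "environmental context" concrete: here is a minimal Swift sketch of the per-frame depth data iOS developers already receive from ARKit on LiDAR-equipped iPhones and iPads today. We're assuming the Vision Pro will expose something comparable out of the box, which Apple has not confirmed; the APIs shown are the real iOS ones.

```swift
import ARKit

// A minimal sketch of the depth data ARKit already hands developers on
// LiDAR-equipped iOS devices -- the kind of "out of the box" environmental
// data we anticipate on Vision Pro (an assumption, not confirmed by Apple).
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Request per-frame depth maps if the hardware supports them.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap is a CVPixelBuffer of per-pixel distances in meters.
        guard let depth = frame.sceneDepth?.depthMap else { return }
        let width = CVPixelBufferGetWidth(depth)
        let height = CVPixelBufferGetHeight(depth)
        print("Received a \(width)x\(height) depth map")
    }
}
```

Depth maps like this are useful, but they only tell an app how far away surfaces are; they don't identify what those surfaces are, which is where custom CV on the video feed comes in.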


The most impactful AR applications (on the current line of iPhones, and on other AR headsets like the HoloLens 2) often rely on custom CV algorithms that require access to video feeds.
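On iOS today the pattern is straightforward: pull each camera frame from ARKit and hand it to a custom Core ML model through the Vision framework. Here is a minimal sketch; "BallDetector" is a hypothetical model class standing in for any proprietary detector, while the ARFrame-to-Vision pipeline is the real API the rumor says the Vision Pro would cut off.

```swift
import ARKit
import CoreML
import Vision

// Sketch of the standard ARKit + Vision pipeline for custom detection.
// "BallDetector" is a hypothetical bundled Core ML model class.
final class CustomDetectionSession: NSObject, ARSessionDelegate {
    let session = ARSession()
    private var request: VNCoreMLRequest?

    func start() throws {
        let model = try BallDetector(configuration: MLModelConfiguration()).model
        request = VNCoreMLRequest(model: try VNCoreMLModel(for: model))
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    // ARKit delivers every camera frame here, pixel buffer included.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let request else { return }
        // capturedImage is the raw camera feed -- exactly the access the
        // rumor says Vision Pro will withhold from third-party developers.
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                            orientation: .right)
        try? handler.perform([request])
        for case let object as VNRecognizedObjectObservation in request.results ?? [] {
            print("Detected \(object.labels.first?.identifier ?? "?") at \(object.boundingBox)")
        }
    }
}
```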


Many of today's most promising and advanced AR projects are in the enterprise or advanced gaming space (frankly, gaming consoles and VR headsets cover most other gaming opportunities at a lower price point today). Unfortunately, advanced gaming and enterprise AR projects, including several of DAS Labs' current projects, will not work on the Vision Pro upon its release because they require:

  1. tracking of specific human technicians working in radiological "Hot Zones" (i.e. locations with high radiological readings),

  2. tracking of players' basketballs or golf balls for either collegiate or pro training and/or whimsical location-based gaming,

  3. custom identification of surgical tools or insertion points used in operating rooms,

And much, much more...


2. iOS devices already let developers deploy custom CV/ML functionality:


According to the Upload VR article, even iPhone developers couldn't access raw camera feeds until 2010 (a rather arbitrary delay, in hindsight). It's funny to consider that the Vision Pro's standard of privacy isn't being applied to Apple's 1.5 billion iOS devices currently out in the wild.
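For reference, this is the raw feed access iPhone developers have had since AVFoundation's camera capture APIs arrived (2010, per the article's timeline). A minimal sketch of the standard pattern:

```swift
import AVFoundation

// Sketch of the raw camera access every iPhone app can request today:
// AVCaptureVideoDataOutput delivers each frame to app code as a pixel buffer.
final class RawFeedReader: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let captureSession = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        captureSession.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.feed"))
        captureSession.addOutput(output)
        captureSession.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each frame arrives as a CVPixelBuffer, ready for any custom CV/ML.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        _ = pixelBuffer
    }
}
```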


This will be a problem for the multitude of enterprise AR products and the most innovative AR games that use object tracking. Simple AR headset gaming can be done with only a depth map, solid SLAM (device tracking), and hand tracking. But a large volume of enterprise applications, and the most cutting-edge games, leverage custom object detection IP that individual companies have invested in, sometimes for years, on other AR/MR headsets or on mobile iOS and Android platforms (see the above list of current DAS Labs projects that will not work on the Vision Pro on day one).


In a sort of twisted reality, this means the Vision Pro will launch behind the iPad and iPhone in actual environmental context and unique app-based capabilities. That is a tough pill to swallow considering that the enterprise segment accounts for the majority of AR's vetted use cases today. It will also impact gaming by preventing AR content from interacting with physical objects, as in our partner project PuttScape (see further below).


3. Apple, as a purported leader in the AR space, should know better.


VR headset manufacturers can be forgiven for this error because VR developers typically don't require "environmental context". Virtual reality, while often lumped together with AR (for reasons I won't get into in this article), is a completely different paradigm to develop for. VR is inherently immersive, taking the user to another environment/reality for truly novel gameplay, training, and so on. Even when VR headset manufacturers like Meta add passthrough camera capabilities, they miss the mark by not allowing developers to leverage those video feeds for novel interactions with real-world objects.


Some argue that the standard "VR-first" headset manufacturers' (HTC Vive, Meta, Pico) champion producers are indie and professional game developers, training companies, and the like, who choose the VR paradigm for its immersion factor. It can be argued that the VR-first manufacturers don't entirely understand AR, nor should they.


Most VR producers/developers have no interest in or use for external video feeds, nor do they count the ability to develop and deploy CV or ML capabilities among their creative arsenal. Apple knows full well that even the smartphone app ecosystem evolved from a volume of third-party developers leveraging its suite of hardware and software capabilities in unique ways. Now, AR developers who have innovated for the last several years on small iOS screens will not be able to bring their magic over to the Vision Pro.


4. The Glasshole Effect doesn't apply to the Vision Pro.


One of the commonly cited "AR glasses lessons learned" from the pioneers who wore Google Glass in public ten years ago was that wearers were chastised at bars, grocery stores, etc. because other people thought they were being secretly filmed (i.e., a privacy alert).


While time can normalize technology's intrusion into our daily lives, we won't even argue that wearing AR glasses in public has been normalized yet. We will posit, though, that the negative attention Google Glass wearers received had more to do with the device's lack of utility and its looks than with the potential of being secretly recorded. A distinction must also be made: unlike Google Glass, the Vision Pro is not meant to be worn in bars, grocery stores, etc., so this invasion-of-privacy concern is misplaced. To draw your own conclusions on this point, check out this "Glasshole" article from Wired.


AR applications like PuttScape, according to this rumor, will not be deployable on the Vision Pro headset at release, even though they already work on today's iPads and iPhones.


As if AR/XR/MR needed any more obstacles to overcome (we list these marketing acronyms as a rib jab). Having spent the last nine years focused on this space, jumping on every new software and hardware release and hacking desperately around and through arbitrary limitations (many placed as IP ownership strategies by Big Tech), we hoped Apple wouldn't take the easy route and hide behind "privacy concerns" as a buzzword instead of working through how to manage privacy before release.


The fact is, the Vision Pro will either be compelling enough for people to risk being filmed, likely at the office or while gaming, just as we've accepted privacy tradeoffs to eagerly use email, social media, etc., or it will fail to bring any utility to users (like Google Glass) and flop just the same.


Releasing such an ambitious AR headset (a risky venture even for tech titans like Apple) while telling the industry's most innovative developers to relegate their CV/ML capabilities to iPads, iPhones, and competitor AR headsets is like releasing the iPhone in 2007 and telling everyone to keep using their iPods or Walkmans to listen to music.


