Update: June 6, 2023. Apple has announced Apple Vision Pro (hardware product, shipping 2024, $3,500) and visionOS, the platform, with video of the hardware and simulations of platform and feature previews. I feel like I got maybe 90% of this right, especially the emphasis on AR, the extensive use of 2-D windows, and even the use of the Oblong/Underkoffler term "spatial computing." (Pats self on the back.) A lot of Apple rumor mill folks and hardware leaks described the hardware product in correct detail, but with curiosity and even confusion about why it would be compelling from a platform point of view, from a software and functionality point of view. If I may indulge myself, I feel pretty proud that I guessed what Apple saw as the platform potential even though most pundits were intrigued but perplexed and offered little enlightenment, unless they were former Oblong Industries employees like me, in which case they mostly work at Apple now anyway (I know of three or four of my former coworkers there, most of whom had worked at Apple before as well). End of update.
(Original February 2023 prognostications:)
Apple’s new Reality OS hardware may become a reality in 2023, perhaps this quarter or early next quarter. With many credible rumors flying, a lot of smart people have given what I think are sort of superficial first-order thoughts about Apple’s entrance into the headset market. Apple did things differently with the iPhone and it changed the world and changed Apple. Will Apple just enter the headset space and do the same things as other players?
What is Possible?
What we know about the current state of the art:
It is important to understand just how difficult headset technology is, in terms of processing power, weight, display specifications, and battery life. It is all very non-trivial.
What we know:
- Apple has custom low-power silicon expertise that could power a portable headset device, probably outpacing the capabilities of any competitors. Apple can sell you a 14” laptop with 96 GB of unified memory, which means more GPU-accessible memory than most desktops and laptops offer!
- Apple has camera expertise and entire teams that have built sophisticated pipelines. They are not new to this.
- Apple has already shipped AR demos on iOS and iPadOS devices with LiDAR sensors. They have solved some tracking problems (keeping a virtual object fixed in place) and even occlusion problems in existing software. However, when your phone is the viewport, your hands are busy holding it and cannot interact. Apple needs hand tracking, and there is probably no reason Apple can’t do this well.
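The tracking and occlusion pieces mentioned above are already shipping APIs. As a minimal sketch (assuming an iOS device with a LiDAR sensor and a RealityKit `ARView`), enabling real-world occlusion of virtual content looks roughly like this:

```swift
import ARKit
import RealityKit

// Sketch: enable LiDAR-based scene reconstruction and person
// segmentation so virtual content is hidden behind real-world
// geometry and people. Requires a LiDAR-equipped iOS device.
func configureOcclusion(for arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Build a mesh of the environment from the LiDAR sensor so
    // virtual objects can be occluded by real-world surfaces.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
    }

    // Let real people occlude virtual content using estimated depth.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }

    arView.session.run(config)
}
```

This is configuration of an existing framework, not new capability; the point is that the hard perception problems here are ones Apple has already shipped solutions for.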
- Apple has global map data from its Apple Maps product, which could be useful in software that interacts with the real world. Apple has shipped GPS for years now, which tells where you are in the real world at a global scale, down to the building you are in.
- Apple has beacon technology that can tell devices where objects are in the real world, at the room or building or meter-proximity scale. And near-field expertise to tell when devices touch things in the real world.
- Apple has shipping technology that can track the face and expressions and deform a 3D avatar, in the form of the otherwise pointless Memoji initiative. This tech would be far from pointless in Reality OS.
- Spatial computing has been around in research labs and in sci-fi for a long time (see the gestural interfaces John Underkoffler designed for Spielberg’s 2002 film Minority Report; Underkoffler later consulted on Marvel’s Iron Man film and founded Oblong Industries. I worked at Oblong from 2010 to 2012.) A lot of these gestures and multi-screen and real-world computing ideas have had decades to stew, so this is not actually a totally new software world, even if the specialized hardware hasn’t yet stuck. There is good prior art here.
- Apple has deep content plays with existing content and pipelines producing more (Apple TV+, Apple Arcade, Apple Fitness+) that could be compelling in a Reality OS world. They don't have to wait for others to make compelling content for this space. They could make it themselves.
- Apple FaceTime is a well-established video calling system that consumers trust and know how to use. Apple recently introduced SharePlay for remotely working with apps and data while sharing presence. It is half-baked but shows where they want to take things.
- Apple is good at creating new UI paradigms by taking the best of existing (often external) tech demos and turning them into a coherent blend of hardware and software that proves useful and intuitive to people. You might even say that this is Apple’s sine qua non.
Reality Gambit, or Reality Gamut?
Like a lot of the pundits, I have no strong desire to strap a computer to my skull, unless the advantages are worth it.
In my view, the holy grail might need to be a device that can:
- Show the real world in high fidelity with no virtual pixels intruding. Bonus points if legacy systems are usable without taking the headset on and off. (More about this below, under Rectangular Proxies.)
- Show overlays of useful information superimposed on top of the real world, such as floating marquees showing contextual information or directions (Augmented Reality). The user(s) need to be able to interact with this data, avatars, creatures, and whatnot, probably with hand gestures and voice (Siri).
- Allow the option for a fully virtual experience that can completely block out the outside world for meetings, gaming, relaxation, movie watching, and future blending of these activities. (With outward-facing depth sensors that switch the software back to showing external cameras when danger is imminent, as current VR headsets do.)
I think that if Apple can’t create a product and a platform that spans this entire spectrum, or Reality Gamut, of Mixed Reality = AR + VR, then any Apple headset product is dead in the water. We need more from Apple than just AR or just VR to justify a new software platform. But I think this full Reality Gamut could be enough to carry one new device category.
No way will Apple debut in 2023 with lightweight “glasses” that are transparent yet can do all of the above. Not with current technology. They know this is five to twenty years off.
Yet no way would Apple not understand the long-term goal: to subsume all computing into one platform or paradigm that can link the old portable and desktop and large-screen paradigms (watchOS, iOS, iPadOS, macOS, tvOS) with a new pervasive computing paradigm, which would eventually require “magical,” as-yet-non-existent spectacles to be long-term viable for most people.
Yet, if Apple can launch an expensive XR (truly mixed gamut of normal reality to AR to fully immersive VR) headset in 2023—with some drawbacks like (a) weight, (b) battery and (c) price, but no compromises on tracking, interaction, lag, etc.—then they can start building the future now. In other words, developers could start writing software this summer, in 2023, that might run with a similar experience on a “magical” spectacle headset in the future, decades hence. I believe this will be their strategy. Most importantly, the interaction paradigm will not need to change even if the hardware changes to allow transparent views of the actual real world instead of cameras projecting the real world onto screens. If the platform is conceptualized and built right, up front, the same software might run on future more-advanced hardware.
Imagine what you could build with just that: a pair of iPhones strapped to your head. You would see out of the high-quality iPhone camera(s) with each eye looking at an expensive, high-dynamic-range, high-resolution “Retina” display or displays. The system could then superimpose pixels with UI and content experiences. Existing LiDAR sensors and AR software can run occlusion so things appear in the real world where they ought to be, especially behind objects. And you could opt into a fully immersive environment for VR if your surroundings were safe, like the “I’m not driving” button in iOS. And mixed experiences, like turning your floor into lava, would make for pointless but powerful demos that would at least get the point across.
For some reason I have not seen tech journalists and pundits frame the headset Gordian Knot the following way (perhaps their jobs are to critique existing things instead of figuring out how to shift the future using logic and creativity?):
- Apple only needs to solve the software platform paradigm problem up front, while purposely embracing hardware tradeoffs (a), (b), and (c) (above), and commit to making gradual headway on the hardware going forward, while building the app and content experiences that will become increasingly accessible as (a) weight decreases, (b) battery-life improves, and (c) prices drop.
Apple did exactly this with the watch, the phone, the tablet. The original iPhone is terrible by today’s standards, but it got the software off the ground. That software still runs in the same paradigm, sixteen years later. The original iPad is terrible by today’s standards, but it was leaps and bounds better than anything on the market at the time (and modern iPads are still way better than anything on the market). The original Apple Watch barely did what it needed to do, but it allowed the Apple Watch to start off as a platform. The current watches for sale are almost exactly the same, in terms of software paradigm, as that original Apple Watch, just much better in terms of form factor, battery life, etc.
I will speculate that Apple could bring legacy software into the Reality OS world by allowing virtual devices of various sizes, from wearable (wrist) to pocketable (phone-sized) to holdable (tablet-sized) to ergonomic desk-like, to movie-theater or fully immersive. My working idea is that they could ship “foam block rounded-rect proxies” with sexy fiducial markers (Apple has done something similar with the experience for transferring iPhone data to set up a new device, which is way, way sexier than it needs to be). These holdable objects could be interacted with by projecting virtual screens running legacy apps onto the “devices” as seen from the headset; they would function very similarly to current devices, hopefully including touch. I am not sure about the details for this, but Apple brought iPhone and iPad software to the Mac once it ran Apple Silicon, so they understand the value of bringing software from one platform to another. They could bring 2D computing into Reality OS, out of the gate in 2023.
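The fiducial-tracking half of this proxy idea is already possible with shipping ARKit image tracking. A hypothetical sketch (the asset group name “ProxyMarkers” is my invention, not a real resource):

```swift
import ARKit

// Hypothetical sketch of the "rounded-rect proxy" idea: track printed
// fiducial markers on foam blocks using ARKit's shipping image-tracking
// API, so a renderer can paint a virtual app screen onto each block.
func makeProxyTrackingConfiguration() -> ARImageTrackingConfiguration? {
    // "ProxyMarkers" is an assumed asset-catalog group of reference
    // images, one per foam-block proxy.
    guard let markers = ARReferenceImage.referenceImages(
        inGroupNamed: "ProxyMarkers", bundle: nil) else { return nil }

    let config = ARImageTrackingConfiguration()
    config.trackingImages = markers
    // e.g. one "phone" block and one "tablet" block at a time.
    config.maximumNumberOfTrackedImages = 2
    return config
}

// In an ARSessionDelegate, each detected marker arrives as an
// ARImageAnchor; a renderer would attach a textured plane, showing
// the legacy app's rendered output, to that anchor so the virtual
// screen stays glued to the physical block.
```

The missing piece, of course, is routing touch from the physical block back into the legacy app, which is exactly the kind of hardware/software blending Apple is good at.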
I recognize that there are significant challenges to creating a new immersive computing paradigm. I think Apple has a good shot at this, if they take it seriously and have thought deeply about it from a user- and a design- and a “how-things-should-work” perspective. They have done it before, with the Apple II, Mac, iPod, iPhone, iPad, and Apple Watch. I am pretty sure that this new category will not be initially as popular as these six past categories, but I believe eventually it could be more popular. Immersive computing could be the future. I think it is just a matter of when and how.
(And I would rather Apple define that future than Meta—which is just Facebook, a company designed entirely to sell ads through rage-inducing “engagement.”)