To me, one of the most interesting things when Apple comes out with a completely new class of device is seeing how they solve usability challenges peculiar to that type of product. For example, interactions with the Apple Watch touchscreen and side buttons have to take into account the fact that the user will be covering most of the display with a finger.
That experience of creating new ways of doing things then resurfaces elsewhere, as with the Digital Crown on the AirPods Max.
So, when the fabled Apple AR headset is finally revealed, it will be fascinating to see how Apple handles interaction with information and interface elements presumably dangling in mid-air.
I think a fair bit of what one sees overlaid will be contextually chosen info and messages, rather than heavy UI, but surely you'll somehow be able to make selections or dismiss things?
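For fun, here's roughly what that might look like with today's handheld ARKit: a minimal sketch, assuming selection boils down to hit-testing a 2D point (a screen tap here; gaze, pinch, or a crown on a headset, who knows?) against floating overlay nodes. The `ARSCNView` hit-testing is real SceneKit/ARKit; the `dismissCard` helper and the gesture itself are pure guesses on my part.

```swift
import UIKit
import ARKit
import SceneKit

// Hypothetical sketch: pick and dismiss a floating overlay "card"
// rendered as an SCNNode in an ARSCNView.
final class OverlayPicker {
    func handleTap(_ gesture: UITapGestureRecognizer, in view: ARSCNView) {
        let point = gesture.location(in: view)
        // Did the tap land on one of the rendered overlay nodes?
        guard let hit = view.hitTest(point, options: nil).first else { return }
        dismissCard(hit.node)
    }

    private func dismissCard(_ node: SCNNode) {
        // Fade the selected overlay out, then remove it from the scene.
        node.runAction(.sequence([
            .fadeOut(duration: 0.25),
            .removeFromParentNode()
        ]))
    }
}
```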
Imagine also that the headset and its descendants aren't just about directions, messages, and content consumption, but give people a way to add hyper-localised information or tag web links to real-world objects… Apple Maps is only the beginning. It'll start with people holding up their iPhone, but the AR headset will be a lot more immersive. Add in what Apple has been doing with iBeacon/UWB (much more precise location data than GPS) and Memoji (overlaying on top of detected faces/objects), and there is definitely something exciting Apple is working towards.
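To make that concrete with what's shippable today, here's a minimal sketch of "tag a web link to a real-world object" using plain ARKit world anchors on an iPhone. The `LinkTag` payload and the anchor-naming scheme are my own invention; the `ARAnchor` and `ARSessionDelegate` APIs are real ARKit. Whatever Apple actually builds (presumably with UWB-assisted, shareable anchors) will be far more sophisticated.

```swift
import ARKit
import simd

// Hypothetical payload: a web link pinned to a spot in the real world.
struct LinkTag: Codable {
    let url: URL          // the web link attached to the object
    let label: String     // a short caption shown in the overlay
}

final class LinkTagger: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private var tags: [UUID: LinkTag] = [:]   // anchor ID -> tag payload

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]
        session.delegate = self
        session.run(config)
    }

    /// Drop a link tag at a real-world transform, e.g. one obtained from
    /// a raycast against a detected surface where the user tapped.
    func tag(url: URL, label: String, at transform: simd_float4x4) {
        let anchor = ARAnchor(name: "link-tag", transform: transform)
        tags[anchor.identifier] = LinkTag(url: url, label: label)
        session.add(anchor: anchor)
    }

    // When anchors are added, a rendering layer would place a floating
    // label at each tagged anchor's position.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor.name == "link-tag" {
            if let tag = tags[anchor.identifier] {
                print("Render '\(tag.label)' -> \(tag.url) at \(anchor.transform)")
            }
        }
    }
}
```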
If the name of the first (high-end) headset is indeed “Reality Pro”, I think Apple eventually wants people to think “Apple Reality” when they think AR, not “Augmented Reality”.
Exciting times!
Currently listening: Pylon - “Cool (Extra)”