The Future of AR Beyond the Vision Pro Is Already Brewing

I recently flew out to Long Beach, California, for the AWE augmented and virtual reality conference, but I left my mixed reality VR devices — the Apple Vision Pro and Meta Quest 3 — back in New Jersey. Instead I took two pairs of smart glasses: Meta’s Ray-Bans and Xreal’s Air 2 Pro. I took photos and made calls with the Ray-Bans. I watched movies on the plane with Xreal. And I didn’t miss those chunky VR goggles one bit.

VR headsets have never really caught on as mainstream devices, though their popularity for older kids, fitness and gaming keeps creeping forward. They still have a few problems: they're big, they're heavy, and they mostly shut out the world. Recent headsets like the Vision Pro and Quest 3 blend camera feeds of the outside world with virtual graphics to create mixed reality, restoring some connection to the world beyond the goggles, but they're still relatively big and awkward to wear.

High-tech glasses, on the other hand, look and feel more like the prescription eyeglasses I wear every day. Meta's and Xreal's glasses slip on and off and fold up to fit in little cases. They do very different things: the Ray-Bans are headphone and camera glasses without displays, while Xreal's glasses are wearable displays with speakers that need to be tethered to another device. These gadgets don't offer anything like the full-fledged mixed reality of the Vision Pro or Quest 3, but their increasing utility points to a future of augmented reality beyond bulky headsets.


Based on the latest reports, I expect the next generation of Apple's Vision Pro to arrive in the next year or two, be smaller and more affordable, and possibly tether to a phone, MacBook or iPad. Meta's leap to AR glasses could take longer: it will need to straddle a divide between the self-contained, bulkier Quest headsets and the more limited, phone-connected Meta Ray-Bans. It won't happen quickly, and I don't expect the next Vision headset to truly look like glasses at all, but a transition to a new wave of smaller devices is slowly getting underway.

Meanwhile, AWE reminded me that better lenses, displays and hand tracking are coming but still face real challenges. How will future glasses offload all their processing? What about the battery? These devices must evolve, but exactly how they do that will depend on Google and Apple.


Apple’s Vision Pro is the most advanced standalone mixed reality device, but it’s far larger than glasses (Meta Ray-Bans shown next to headset for comparison).

Scott Stein/CNET

Vision Pro is clearly the start of a new interface movement

Many developers and startup companies I talked with referred to the Vision Pro as defining a future user interface. Even though the Vision Pro’s hand-and-eye-tracking combination isn’t perfect, it’s the closest anyone’s come to finding a future controller-free interface.

The Meta Quest 3 and Vision Pro were scattered everywhere around AWE’s expo floor in plenty of peripheral and software demos. That’s because they both support hand tracking, and they combine camera feeds of the real world with overlays of virtual graphics to mix reality surprisingly well. 

Glasses haven't perfected hand and eye tracking and mixed reality yet, but they probably will in the next few years. Many companies I met with are trying to crack these challenges. In the meantime, the Vision Pro feels like the most advanced expression of the mixed reality future, with the Quest 3 right alongside it as a budget option. Headsets like the Vive XR Elite and the Magic Leap 2 are standing in as development platforms for new startup product ideas, too.


TapXR’s little wrist device measures movement and finger gestures. Expect more watches to have this capability, too.

Scott Stein/CNET

New ways to track hand gestures in small devices

Bigger headsets like the Vision Pro and Quest 3 can already handle complex hand tracking, but glasses don’t have the battery life to pull off such tricks yet. I saw a few ideas for solving the bottleneck, using either new cameras or connected wrist wearables.

Doublepoint, a developer of touch interfaces, uses a software layer on Samsung watches to add motion control and pinch and tap gestures to your wrist. With it, I turned real lights on and off in a smart home environment, controlled a cursor and changed music tracks in a smart TV demo. I also saw people navigate mixed reality in a Magic Leap 2 headset with enhanced watch tracking. TapXR, another startup, has a camera-equipped motion-sensing wristband that also has tap and pinch gestures, and the company is working out tap-to-type systems on tabletops. These more gesture-rich wearables feel like the next step of what’s already becoming possible: Apple’s latest watch already has double-tap gestures, with more likely coming in the future.


Ultraleap’s hacked-on extra motion-tracking camera on Meta’s Ray-Bans shows how low-power sensors could come on board smart glasses soon.

Scott Stein/CNET

Ultraleap, a company whose hand-tracking technology already appears on existing VR and AR headsets, is testing a smaller, more power-efficient event-camera technology, which senses only rough changes in light and movement rather than specific details. It could run for hours on smaller glasses while watching for hand microgestures, similar to what the Apple Vision Pro does with more power-hungry infrared cameras. I tried a demo on a modified pair of Meta Ray-Bans that let me move my fingers to switch music tracks and tap to select songs. The idea is an interface that's easier to use than the touchpad on the arms of Meta's current glasses, and casual enough to use while moving around.

Reports of Apple developing AirPods with their own cameras don’t sound so far-fetched, especially if those headphones use power-efficient event cameras similar to what Ultraleap was demonstrating. These cameras wouldn’t necessarily be able to take photos; instead, they’d help track the world so that hand gestures could work with or without glasses.


AR waveguide lenses by Lumus, with prescription lenses added on, look normal in frames I checked out in person (the glasses still needed to be tethered for demos). The clear lenses can display big images in front of my eyes.

Scott Stein/CNET

Displays and lenses are almost there

I saw lens and display demos showing that normal-looking smart glasses with big, clear displays are technically possible, but it's still unknown how affordable or power-efficient they'll be. Lumus, a company focusing on lens components, makes its own clear and vivid reflective waveguides that are bonded with prescription lenses. I looked at big, clear images in a few demos through what looked like practically everyday glasses, though these glasses were permanently tethered to a nearby laptop to receive the images.

Avegant, another component maker, focuses on the display engines that power these glasses and project 3D imagery onto lenses like Lumus'. I looked at another prototype set of smart glasses, wireless and almost normal-looking, that projected images onto waveguide lenses bright enough for me to see even while looking out a bright hotel room window in daylight.


Avegant’s display tech put on prototype glasses can make for a small and nearly normal-looking set of frames.

Scott Stein/CNET

Waveguide technology uses tiny markings etched into lenses to mirror displays from side projectors into our eyes. Companies like Microsoft and Magic Leap use the tech in the Hololens and Magic Leap headsets, and the lenses and display hardware are becoming extremely compact. Finding a processor good enough to power those displays and lenses without being too bulky is still a work in progress: Magic Leap does it with a hip-worn processor, and the Hololens 2 is a full-head goggle setup nearly as big as a Meta Quest headset.

These glasses need to connect to something, and that something is most likely our phones.


Spacetop, a type of screenless laptop made to work with connected AR glasses, is a startup product imagining where computers will be when smart glasses become commonplace.

Scott Stein/CNET

Phones and computers need to evolve for glasses

The problem with current phones, however, is that they aren't built to meet most AR glasses' needs. Apple locks iPhones down to little more than basic display output, which is fine for wearable display glasses like the Xreal Air 2 Pro that connect over USB-C just like any other external monitor. Android phones, on the other hand, are fragmented in their capabilities.

Xreal has its own AR-based ecosystem, for example, called Nebula, which runs via an app on Android phones to turn its glasses into true AR wearables. However, not all Android phones can run it well. To address the fragmentation, Xreal has made its own sort of phone specifically for its glasses: It’s called the Xreal Beam Pro, and it shows a possible road ahead for future phones.


When will other phones work seamlessly with AR glasses? Seen here: Xreal Beam Pro, an almost-phone for Xreal’s products.

Scott Stein/CNET

The Beam Pro is a $200 Android device that runs Android apps like a phone would, but displays them on Xreal's glasses as a seamless part of a multiscreen computer system, with multitasking and 3D AR modes, too. It has optional 5G, so it can be used away from a Wi-Fi network. The Beam Pro also has a second passthrough charging port, so the glasses can stay connected while the device charges, along with widely spaced cameras for 3D "spatial" photo and video recording.

Qualcomm has made a case for phone-driven AR glasses for years, but Google hasn’t yet made this AR glasses connection a core part of Google’s Android OS. That may happen soon, though: Hugo Swart, Qualcomm’s former head of XR, recently joined Google. Meanwhile, Google also has an XR headset co-developed with Samsung and Qualcomm and expected to be announced in the next year, which could be the start of how Google Play will be linked between headsets and phones.


Apple could follow when its next-generation Vision hardware finally finds a way to work with iPhones and iPads; right now, all the Vision Pro does with other Apple hardware is act as an extended monitor for Macs. Laptops should have an even deeper co-evolution with glasses, too, which is something the startup Spacetop, maker of another product I got to see at AWE, is exploring. The Spacetop is like a lidless Chromebook with a pair of Xreal AR glasses permanently attached: the glasses are the display for the laptop base, and they recognize and track the base through space, keeping the display centered like a ghostly extension of it.

Spacetop and the Xreal Beam Pro feel like early takes on how to power AR glasses with other computers. But eventually, everything we use should connect well with glasses, the way headphones automatically pair with whatever we carry. And down the road, all these devices should develop similar types of hand-tracking inputs through cameras, watches and other wearables: maybe even, as reports have been hinting, AirPods with cameras.

Next up: Expect slightly smarter glasses while phones work themselves out

AR glasses are still in the future, mostly because the phones we use, and the world around us, aren't ready for them yet. Safety concerns for public use haven't been worked out, and location-based AR systems built on map apps haven't perfected everyday ways for shared experiences to work in specific locations without being disruptive. It's also unclear whether phone batteries and processors are strong enough to power AR glasses without constantly overheating or needing recharges.

For now, next-gen mixed reality headsets will likely follow the Vision Pro and Quest 3 path, adding more AR features for home use. And expect glasses to gradually gain more AI features, along with better cameras, audio and displays. Somewhere between the Meta Ray-Bans and Xreal's glasses lies my perfect set of next-wave smart glasses, and I'd expect better versions to emerge within the next year. Glasses that may be ready to connect to our phones, but are waiting for our phones to grow up and evolve to work better with them, too.
