Smart glasses are having a moment. Meta’s latest Ray-Ban specs are available now, its newest Oakleys are coming soon, and Google and Samsung will probably jump into the tech race for your face next year with Android XR. With all that momentum, it always felt inevitable to me that Apple would introduce its own smart glasses sooner or later. Recent signs point to sooner.

Reports claim that Apple has paused its rumored Vision Air hardware (the smaller, lighter successor to its existing Vision Pro VR headset) in favor of smart glasses. To me, that sounds like a pivot to compete with the wave of AI-powered glasses that everyone from Meta and Samsung to Google, Snap, Amazon, Xreal, Rokid and even OpenAI is either selling, developing or rumored to be exploring.

Apple doesn’t have Ray-Ban-style smart glasses yet, but the Vision Pro has already begun exploring the XR frontier on our faces. (Scott Stein/CNET)

As I test various smart glasses this fall, I see the pieces coming together for Apple. It already has the product catalog and wearable technology in place to make a splash, and it’s much further along than you might realize. Here’s how Apple’s current headphones, watches, phones and software could shape its first pair of smart glasses.

Audio tech via AirPods 

Apple’s been working on tech for our faces for over a decade. When I wore the first AirPods back in 2016 and got mocked for how weird they looked, it felt like Apple testing a design flex for our faces. It succeeded: Today everyone wears AirPods and other wireless earbuds, and no one gets mocked for it.

A long time ago, having these in my ears was surprising. Look where we are now. (Scott Stein/CNET)

Since then, Apple’s been rolling out computational audio features that could fit perfectly into smart glasses. Think live translation in the latest AirPods firmware, head-nodding gestures for quick replies, heart rate tracking, ambient noise filtering to sharpen focus or assist with hearing loss, and spatial 3D audio. There’s also the new open-ear noise cancellation tech on AirPods 4, plus FDA-cleared hearing assistance, a feature already popping up in smart glasses from companies like Nuance.

These technologies could all apply to smart AR glasses, which have tiny open-air speakers built into the frames for audio. AirPods could be just the beginning.
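Some of this is already in developers' hands. Apple's Core Motion framework exposes AirPods head motion today through CMHeadphoneMotionManager; here's a minimal sketch of nod detection built on that real API. The threshold logic is my own illustrative assumption, not how Apple's gesture recognition actually works.

```swift
import CoreMotion

// A minimal sketch: reading AirPods head motion with Apple's real
// CMHeadphoneMotionManager API (iOS 14+). The nod threshold below is
// an illustrative assumption, not Apple's implementation.
final class HeadNodDetector {
    private let motionManager = CMHeadphoneMotionManager()
    private var lastPitch: Double = 0

    func start(onNod: @escaping () -> Void) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self, let motion else { return }
            let pitch = motion.attitude.pitch
            // Treat a quick downward pitch change as a "nod" (assumed cutoff).
            if pitch - self.lastPitch > 0.35 {
                onNod()
            }
            self.lastPitch = pitch
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```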

My wrists last week: on my right, a Neural Band to control Meta Ray-Ban Displays; on my left, an Apple Watch that already has a few gestures onboard, but no glasses to control yet. (Scott Stein/CNET)

Control tech via Apple Watch

Meta’s newest display glasses come with the Neural Band, which controls the on-lens display using electrodes to read tiny muscle impulses and turn them into in-air gestures. Apple already has a foot in the door with its own wrist-based gesture controls.

Apple Watches already support double-tap and shake-to-dismiss gestures to quickly reply to messages, answer calls or stop timers. I was impressed by how early double-tap showed up on the watch and immediately thought about how naturally it could connect with VR and AR headsets.
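Apple has even opened the gesture to third-party apps: in watchOS 11, a SwiftUI view can nominate the control that Double Tap should trigger. A minimal sketch, with a hypothetical timer-stop button standing in for a real app:

```swift
import SwiftUI

// A minimal watchOS sketch: handGestureShortcut (watchOS 11+) nominates
// the control that the system Double Tap gesture should trigger.
// The timer-stop action here is hypothetical, for illustration only.
struct TimerControlsView: View {
    @State private var timerRunning = true

    var body: some View {
        Button(timerRunning ? "Stop Timer" : "Stopped") {
            timerRunning = false
        }
        // A pinch of index finger and thumb (Double Tap) fires this action.
        .handGestureShortcut(.primaryAction)
    }
}
```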

Apple’s glasses could also link directly to the Watch for quick access to on-screen readouts, allowing them to skip a built-in display altogether. Think of the Watch as a viewfinder for camera glasses, or a wearable touchscreen for selecting connected apps. Meta’s already hinted that its Neural Band could eventually flex into a watch, and Google’s got plans for watches and glasses to intersect, too.

The iPhone’s cameras keep packing more into a smaller frame. Next up: glasses? (Joseph Maldonado/CNET)

Camera tech via iPhone Air (and Vision Pro)

Apple’s an old hand at shrinking high-performance cameras down into small spaces. The super-thin iPhone Air pulled off the most impressive miniaturization yet this fall, and glasses demand even smaller cameras.

Apple has experience putting cameras and other sensors on headsets already. The Vision Pro’s array of cameras is likely a lot more complex than anything Apple’s glasses would include.

And Apple could also borrow from its existing controls. The iPhone’s Camera Control button already has a capacitive touch sensor, which could hint at how its glasses might handle navigation via touch on the arm of the frames.
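That sensor is already a developer surface on the iPhone. A minimal sketch using iOS 18's real capture-controls API, where the Camera Control's capacitive swipes drive a zoom slider; the surrounding session setup and controls delegate are assumed to exist elsewhere.

```swift
import AVFoundation

// A minimal sketch of iOS 18's capture-controls API: the Camera Control's
// capacitive surface drives controls attached to the session, like this
// system-defined zoom slider. Session configuration and a controls
// delegate (AVCaptureSessionControlsDelegate) are assumed elsewhere.
func attachZoomControl(to session: AVCaptureSession, device: AVCaptureDevice) {
    guard session.supportsControls else { return }

    // Light swipes on the Camera Control adjust this slider.
    let zoomSlider = AVCaptureSystemZoomSlider(device: device)
    if session.canAddControl(zoomSlider) {
        session.addControl(zoomSlider)
    }
}
```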

Maybe Apple will add stereo 3D recording, letting you capture spatial videos on the glasses to relive later with a Vision headset. It’s the same record-your-memories fantasy the Vision Pro tried to sell with its in-headset recording.
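That part wouldn't be a stretch: iPhones can already record spatial video through AVFoundation. A minimal sketch, assuming the spatial-video flags iOS 18 added to AVCaptureMovieFileOutput:

```swift
import AVFoundation

// A minimal sketch: opt a movie output into spatial (stereo 3D) video
// capture when the hardware supports it. Flags assumed from iOS 18's
// AVCaptureMovieFileOutput API; the rest of the pipeline is omitted.
func enableSpatialVideo(on output: AVCaptureMovieFileOutput) {
    guard output.isSpatialVideoCaptureSupported else { return }
    output.isSpatialVideoCaptureEnabled = true
}
```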

Apple needs to up its visual AI game

Apple’s glasses are going to need camera-aware AI services, like the iPhone’s Visual Intelligence. There’s still a lot of work to do to catch up with Google Gemini and Meta AI. But glasses could be the perfect place to introduce that tech, and maybe even train AI models on what they capture over time.
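Some building blocks exist on-device today: Apple's Vision framework will classify what a camera sees in a few lines. A minimal sketch, with the confidence cutoff as an assumed parameter, not a recommended value:

```swift
import Vision
import CoreGraphics

// A minimal sketch of on-device scene understanding with Apple's Vision
// framework: classify what a camera frame contains. The 0.5 confidence
// cutoff is an assumed parameter for illustration.
func classifyScene(in image: CGImage) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    // Keep only the labels the model is reasonably confident about.
    return (request.results ?? [])
        .filter { $0.confidence > 0.5 }
        .map(\.identifier)
}
```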

As Meta is finding with its own glasses, solving AI for glasses could lead to better AI in other Apple projects down the road, like cars.

Apple Stores are a natural fit for glasses demos

Meta is working to build retail experiences to demo its new Display glasses, but Apple already has a global fleet of stores, the same ones it used for the complex tech demos during the Vision Pro launch. Apple Stores would make perfect sense for glasses fittings, with prescriptions filled online, as the Vision Pro already does with lens partner Zeiss.

Tethering a pair of Xreal One display glasses to an iPhone. Smart glasses and display glasses work with phones now, but they need to connect even better. (Scott Stein/CNET)

Connecting better to phones is Apple’s specialty

Existing smart glasses fall short when it comes to connecting with phones and app stores, and Apple could solve that problem as well as anyone. Since Google and Apple control the pipelines to phone operating systems — Android and iOS — glasses makers are at their mercy to build the connections that make phones, smartwatches and other devices work seamlessly together.

Meta’s glasses have to run through a phone app and get cut off from system assistants like Siri and Gemini. Google’s Android XR should help deepen glasses connections on Android, and Apple needs to do the same on iOS. Apple making its own glasses could also pave the way for better support for other brands, or at least encourage iOS app developers to start thinking about glasses in general.

We probably won’t know anything for sure about Apple’s glasses debut until at least next year, so for now it’s all just guesses. But put all the pieces together, and you can imagine some pretty special specs. Now Apple just needs to put them on my face.


