The Pieces for Apple’s Glasses Are Already Here

Smart glasses are having a moment. Meta's newest Ray-Bans are available now, the latest Oakley models are coming soon, and Google and Samsung are likely to enter the race for your face next year with Android XR. With all this momentum, it always seemed inevitable to me that Apple would introduce its own smart glasses sooner or later, and recent signs indicate it will happen sooner.

Reports claim that Apple has suspended work on its Vision Air hardware, a smaller, lighter successor to its existing Vision Pro VR headset, in favor of smart glasses. To me, this looks like a pivot to compete with the wave of AI glasses that everyone from Meta, Samsung, and Google to Amazon, Xreal, Rokid, and even OpenAI is either selling, developing, or rumored to be exploring.

A pair of Meta Ray-Bans over an Apple Vision Pro headset.

Apple doesn't have Ray-Ban-style smart glasses yet, but the Vision Pro has already begun to explore the boundaries of XR on our faces.

Scott Stein/CNET

Having tested various smart glasses this fall, I can see Apple getting its act together. The company already has a catalog of products and wearable technology that could make waves, and it's further along than you might imagine. Here's how Apple's current headphones, watches, phones and software might shape its first pair of smart glasses.

Audio technology via AirPods

Apple has been working on technology for our faces for over a decade. When I wore the first AirPods back in 2016 and was ridiculed for how weird they looked, it felt like Apple was testing wearable design for our faces. It worked: today everyone wears AirPods and other wireless earbuds, and no one gets ridiculed for it.


Once, having these in your ears seemed strange. Look where we are now.

Scott Stein/CNET

Since then, Apple has released computational audio features that would fit perfectly into smart glasses. Think of the live translation in the latest AirPods firmware, head-nodding gestures for quick responses, heart rate tracking, ambient noise filtering for focus or hearing assistance, and 3D spatial audio. There's also new open-ear noise cancellation on the AirPods 4, plus FDA-cleared hearing aid features, something already appearing in smart glasses from companies such as Nuance.

All of these technologies could be applied to augmented reality smart glasses, which use tiny open-ear speakers built into their frames to produce sound. AirPods may just be the beginning.

Photo of two hands: one wearing an Apple Watch, the other a Meta Neural Band.

My wrists last week: On the right is the Neural Band for controlling the Meta Ray-Ban Display. On the left is the Apple Watch, which already has several gestures but, as of yet, no glasses to control.

Scott Stein/CNET

Control technology via Apple Watch

Meta's newest display glasses come with the Neural Band, a wristband that controls the display on the lens using electrodes that read tiny muscle impulses and turn them into in-air gestures. Apple already has its own wrist gesture controls.

The Apple Watch already supports double-tap and shake-to-dismiss gestures to quickly reply to messages, answer calls or stop timers. I was impressed by double tap when it first arrived on the watch, and immediately thought about how naturally it could extend to VR and AR headsets.

Apple's glasses could also connect directly to the watch for quick on-screen readouts, letting them skip a built-in display altogether. Think of the watch as a viewfinder for camera glasses, or a wearable touchscreen for controlling connected apps. Meta has already hinted that its Neural Band could evolve into a watch, and Google has plans for watches and glasses to intersect, too.


iPhone cameras continue to pack more into smaller frames. Next: glasses?

Joseph Maldonado/CNET

Camera technology via iPhone Air (and Vision Pro)

Apple has a long history of fitting high-performance cameras into small spaces. The super-slim iPhone Air pulled off the most impressive miniaturization this fall, and glasses require even smaller cameras.

Apple also has experience mounting cameras and other sensors on headsets. The Vision Pro's camera suite is probably a lot more sophisticated than anything Apple's glasses would need.

And Apple could borrow from its existing controls, too. The iPhone's Camera Control button already has a capacitive touch sensor, which hints at how glasses could be controlled by sliding a finger along the frame's temple.

Perhaps Apple will add stereo 3D recording, letting you shoot spatial video on the glasses and relive it later in a Vision headset. It's the same promise of recording your memories that the Vision Pro tried to sell with headset-based capture.

Apple needs to up its visual AI game

Apple's glasses will need camera-aware AI services, like the iPhone's Visual Intelligence. Apple still has a lot of work to do to catch up with Google Gemini and Meta AI, but glasses could be an ideal place to introduce this technology, and perhaps eventually even help its AI models learn from what they're capturing.

As with Meta, solving the AI problem in glasses could lead to improved AI in other future Apple projects, such as cars.

Apple Stores are ideal for displaying glasses

Meta is working on building a retail experience to showcase its new Display glasses, but Apple already has a global fleet of stores, the same ones it used for elaborate tech demos during the Vision Pro launch. Apple Stores would be ideal for glasses, with prescription lenses available too, as the Vision Pro already offers through lens partner Zeiss.

Xreal One glasses connected by cable to iPhone 15

Linking a pair of Xreal One glasses to an iPhone. Smart glasses and display glasses already work with phones, but their connectivity needs to get even better.

Scott Stein/CNET

Improving phone connectivity is an Apple specialty

Existing smart glasses don't do a good enough job of connecting to phones and app stores, and Apple can solve this problem as well as anyone. Because Google and Apple control the phone operating systems, Android and iOS, they have the power to build connections that let phones, smartwatches, glasses and other devices work together seamlessly.

Meta's glasses have to work through a phone app and are cut off from Siri and Gemini. Google's Android XR should help expand glasses' connectivity options on Android, and Apple could do the same on iOS. Making its own glasses could also pave the way for better support for other brands, or encourage iOS app developers to think about glasses in general.

We probably won't know anything for sure about the debut of Apple's glasses until next year, so this is all speculation for now. But put all the pieces together and you can imagine something pretty special. Now Apple just needs to put them on my face.
