I've been testing smart glasses for nearly a decade. One of the questions I've been asked that whole time is, "Oh, but can you see anything in them?" For years, I had to explain that no such glasses existed.
That's no longer the case. While I've seen a few glasses with displays over the past year, Meta's Ray-Ban Display glasses feel the closest to realizing what so many people envision when they hear the term "smart glasses."
To be clear, they don't offer the kind of immersive AR that's possible with Meta's Orion prototype. In fact, Meta considers "display AI glasses" a category entirely separate from AR. The display appears in only one lens, the right one, with a 20-degree field of view that's much smaller than Orion's 70 degrees. That sounds like a big compromise, but it doesn't feel like one.
A single display feels more practical for a pair of glasses you'd want to wear every day. It's something you glance at when you need it, not something that's always in your view. The smaller size also means the display is much crisper, at 42 pixels per degree. That was especially noticeable when I stepped outside with the glasses on. Thanks to an automatic brightness feature, the image on the display looked even sharper than it did indoors.
I also appreciate that when you look at someone wearing the glasses, you can't see any light from the display. In fact, even up close, the display itself is barely noticeable in the lens.
Having a smaller display also means the glasses are cheaper, at $799, and they don't look like the chunky AR glasses we've seen so many times before. At 69 grams, they're heavier than the second-generation Ray-Ban Meta glasses, but not by much. As someone who has tried on too many thick black smart glasses, I'm glad Meta is offering these in a color besides black. The Wayfarer-style frames still look wide, but the lighter "sand" color feels more flattering.
The Meta Ray-Ban Display (left) and second-generation Ray-Ban Meta glasses (right). The Display glasses are slightly thicker.
(Karissa Bell for Engadget)
The Display glasses are controlled with the Meta Neural Band, a wristband nearly identical to the one I used with the Orion prototype. It uses sensors to detect subtle muscle movements in your hand and wrist, and translates them into actions within the glasses' interface.
It's hard to describe, but navigating the glasses' interface with gestures works surprisingly well. I can see how it might take some time to get used to the various gestures for switching between apps, prompting Meta AI, adjusting the volume and other actions, but they're all fairly intuitive. For example, you move up and down by sliding your thumb along the top of your index finger, kind of like a d-pad. And you can raise or lower the speaker volume by pinching your thumb and index finger together and rotating your hand right or left, like turning a volume knob.
It's no secret that Meta's ultimate goal is for glasses to replace, or nearly replace, your phone. These won't do that, but having an actual display means you can look at your phone a little less.
The display can surface incoming texts, walking directions with a map preview and calendar information. And unlike the live demo Mark Zuckerberg attempted during the keynote, I was able to answer a video call from the glasses, which worked better than I expected. Not only could I clearly see the person I was talking to and their surroundings, I could also turn on the glasses' camera and see a smaller version of the video I was sending them.
I also had the opportunity to try the Conversation Focus feature, which gives you live captions of the person you're talking to in a loud environment where they might be hard to hear. Seeing live captions of a conversation with someone standing directly in front of me was a little surreal. It's genuinely hard to watch the display while also talking to someone, which is probably a good thing. Still, I can see how this could be very helpful for people who are hard of hearing or otherwise struggle to follow conversations. The same goes for translations, which Meta AI handled well.
I also appreciate that the wristband lets you summon Meta AI with a gesture, so you don't always have to say "Hey Meta." It's a small change, but I've had more than enough out-loud conversations with Meta AI in public. The display also addresses my longtime gripe with the Ray-Ban Meta and Oakley Meta glasses: framing photos is really hard. With the display, though, you can see a preview through the lens, as well as the photo afterward, so you no longer have to snap a bunch of shots and hope for the best.
I only had about 30 minutes with the glasses, so I can't really say how they'd fit into my day-to-day. But even after a short time with them, they feel like the kind of smart glasses so many people have been waiting for.