
Meta Ray-Ban Glasses for Vision Loss

Ray-Ban Meta glasses with a grey tint lens folded over the cover

In the small group of professional certified vision rehabilitation therapists (CVRTs), to which I belong, it is not uncommon for us to use the term “magic glasses.” These are the glasses some of our clients hope to find that will improve or restore their vision. It makes sense: for many of them, a new prescription once brought things back into focus. Vision loss from macular degeneration, glaucoma, diabetes, and other conditions can usually be managed with proper care, but vision is not usually restored to what it was previously. Still, the myth, or hope, of the magic glasses persists.

Take a pair of glasses like the Ray-Ban Meta glasses, add artificial intelligence (AI), and it’s easy to see how they might feed the hope that these could finally be the magic glasses those of us with acquired low vision have been hoping for. In my day job, I’ve received a lot of calls about the Meta glasses: where to buy them, and what they’ll do.

Let’s begin with a bit of a reality check on these glasses. The Ray-Ban Meta glasses were designed for social media users (recall that Meta is the new name for the company that owns Facebook) to add pictures and videos to their social media accounts using a stylish pair of glasses equipped with a camera and speakers. Much of this is accomplished with a digital assistant, like Siri or Alexa, summoned with the microphone built into the glasses (“Hey Meta!”). So, it’s important to note that the Meta glasses were not designed for low vision or blind users. The fact that some of the features on the Meta glasses are handy for those with vision loss is purely by chance, a wonderful bit of synchronicity.

This also explains why, compared to other smart glasses designed specifically for the visually impaired, they are so reasonably priced, starting at less than $300, and often available locally at places like Target or LensCrafters. Another bonus to their not being designed specifically as assistive technology is that they are stylish and customizable. The frames are well built, look good, and prescription lenses can be added to the frames with a wide variety of glare filters. In addition to the tech features, these glasses can have some very functional lenses built right in.

How Do the Meta Glasses Work?

One of the features that attracts a lot of interest from potential users is learning that you talk to the glasses and get information with voice prompts, the way you might use a digital assistant like Alexa or Siri. While this is true, it’s important to note that the glasses only work when connected to a smartphone or tablet. The Meta glasses themselves are a hands-free place to put a camera, speakers, and microphone; all the AI processing takes place on the smart device in your pocket, backpack, or purse.

As a result, using the Meta glasses begins with downloading the free Meta AI app from the App Store, or the Google Play Store, installing it on your phone or tablet, then connecting the Meta glasses by Bluetooth to your smart device.

The Meta glasses have a built-in rechargeable battery and a pretty ingenious charging method: the case for the Meta glasses is itself a charging station. The case is charged using a USB-C cable, and whenever the glasses are put into the case, they recharge from it. So once the case is fully charged, it will recharge the glasses numerous times before it needs to be reconnected to a wall outlet. Fully charged, the Meta glasses will run for about 4 hours of moderate use, or about 5 hours of continuous audio playback from streaming music or podcasts.

The AI Magic

Many of us are already familiar with AI, or artificial intelligence. We use it whenever we ask Siri, Alexa, or Google Assistant a question. Apps like Seeing AI, Google Lookout, or Be My Eyes all use AI to identify objects or colors, recognize text, and describe the environment. The Meta glasses use Meta AI to process voice prompts and images from the camera, located on the top left side of the glasses. Once the glasses are paired and connected by Bluetooth, users can interact with them through voice prompts. For example: “Hey Meta, what time is it?” “Hey Meta, what’s the weather today?” More importantly for the user with vision loss, the glasses can be prompted to look and describe or look and read. For example: “Hey Meta, look and describe what’s in front of me.” “Hey Meta, look and read what’s in my hand.” In each case, the glasses will respond with information. In addition, follow-up questions can be asked related to the response. For example, if Meta responds that there is a car in front of you, you might then say, “What color is the car?” or, “What model car is it?”

Meta seems to have a strong impulse to summarize text it reads. This can be handy for getting the basics, such as when sorting mail. If, however, the goal is to read an entire document, the voice prompt “Hey Meta, look and read every single word” does the trick in most cases.

One place where summarization is really handy is reading a restaurant menu. Rather than listening to an entire menu read from start to finish, Meta can quickly check whether certain items are on the menu, or give a summary of the sandwiches, desserts, and even the price range, depending on the follow-up questions. For example: “Hey Meta, is there a Caesar salad on the menu in my hand?” “Hey Meta, what desserts are on the menu in my hand?” For this writer, it is the ability to read and quickly search a menu that truly conjured up the notion of “magic glasses.”

Hallucinations or Misbehavior

The folks who know a lot about AI will tell you that there are times when AI offers what they call “hallucinations.” This is often said with a chuckle, as if it were the misbehavior of an impulsive child. The Meta glasses are not immune. Users can usually cite instances when the AI insisted on a summary instead of the full text, made something up, or provided less than accurate information. For example, when I asked if there was a lamp in the room, Meta responded, “Yes, there’s a lamp on the table next to the bed.” When asked how far away it was, it told me several feet. When asked if the lamp was straight ahead, Meta responded, “The lamp is slightly to the left.” I asked the question a second time and received the same response. In fact, the lamp was to the right of center.

All things considered, the fact that these objects were described along with their approximate distance is really helpful. But the point is that Meta AI is not yet at the place where it can provide basic navigation directions or replace tools like a white cane for orientation and mobility. While the processing speed is quite quick, it is not instantaneous or accurate enough for something like street crossings.

A Brief Tour

As mentioned earlier, the Meta glasses are equipped with the features you’d expect for creating or consuming media content. A camera is positioned over the left lens. Inside the left arm, next to the hinge, is the on/off switch; push the switch forward for on and back for off. Inside each arm of the glasses, near the ear, is a speaker with rather good sound quality. On the outside of the right arm is a touchpad that performs several functions: sound volume can be adjusted with a swipe forward or back, phone calls can be answered with a tap, and so on. On the top front of the right arm is a small button that, when pressed, takes a picture; long press to start recording a video, and press again to stop. Lastly, over the right lens is a small light that signals when a picture is being taken or a video recorded, so those with vision will know when to smile as they are being photographed.

Integrated Apps

Because Meta AI is a product of Meta, it interfaces primarily with other apps in the Meta sphere, like WhatsApp, Messenger, and Spotify. Notable exceptions to this, and perhaps concessions to users with vision loss, are Be My Eyes and AIRA. Both apps can be connected through the Meta AI app and work well with the glasses for hands-free use. The one limitation is the Be My Eyes AI function: when using the app with the Meta glasses, users are limited to calling a Be My Eyes volunteer.

Final Thoughts

Ray-Ban is often thought of as a company that makes sunglasses, and I discovered that the Ray-Ban Meta glasses used for this review had a great polarized, grey gradient tint on the lens, which made them terrific sunglasses even without the smart features. Most of the times I wanted to use the smart features, however, were inside a business or home, for reading. I think for many, a pair of clear lenses or a light tint for indoor glare would be ideal, perhaps with some clip-on glare filters for outside. Flip the clip-ons up when taking a picture.

The camera on the left side took some getting used to, less for reading than for taking pictures. I always seemed to be missing a portion of the picture a bit to the right. Additionally, I often forgot to take off my hat, and the brim was often in my first pictures.

I confess, there were times, usually when using the Meta glasses for reading, that I too thought of them as magical, with the notable exception of the on/off switch, which always just seemed hard to find or use with a fingernail. Many, however, who think the AI extends to navigation or immediate descriptions for things like street crossings will be disappointed by the current limitations. In addition, a number of people I spoke with assumed that because the glasses use voice prompts, they alleviate the need to learn how to use a smartphone. Once the Meta AI app is downloaded and the glasses are connected by Bluetooth, it’s possible to operate the glasses with voice prompts that are fairly conversational, but there is still a need for some basic knowledge of, and comfort with, the smartphone or tablet that runs the Meta AI app.

The Meta Ray-Ban glasses are an accessibility tool as an afterthought, which makes them more affordable and, in many ways, oblivious to the needs of users with reduced vision. For the price and convenience, they offer a great deal in terms of object recognition, description of surroundings, and reading with optical character recognition (OCR).

Ray-Ban Meta AI glasses start at $299. The Meta AI app is a free download from the App Store or the Google Play Store and runs on iOS 14.2 and above and Android 10 and above. Meta offers accessibility support by phone at 855-592-2237.

 
