Disclosure: Some links on this page are monetized by the Skimlinks, Amazon, Rakuten Advertising, and eBay affiliate programs, and Liliputing may earn a commission if you make a purchase after clicking on those links. All prices are subject to change, and this article only reflects the prices available at time of publication.

Facebook’s parent company Meta has been partnering with Ray-Ban to make smart glasses since 2021 (you know, before Facebook started calling itself Meta). But so far they’ve been a niche device with limited appeal.

That… may or may not be about to change. Meta has unveiled a new version called the Ray-Ban Meta Smart Glasses, which are available for pre-order for $299 and up from Ray-Ban or Meta. They should be available beginning October 17th.

Meta says the new glasses bring better cameras, improved audio, an updated processor, and support for live streaming video (rather than just recording video to share later). And maybe one of the biggest selling points? The Ray-Ban Meta glasses will be the first smart glasses to ship with Meta’s new AI capabilities built in.

You can trigger Meta AI by saying “Hey Meta” to start a conversation with the company’s AI assistant using your voice. And next year Meta promises to add “multi-modal” AI features that allow the assistant to respond to information in your environment.

For example, you could look at a building and ask Meta AI what it is (hello Google Lens). Or ask it to translate a menu into a language you understand (like Google Translate). And perhaps most intriguing (because Google doesn’t really do this yet), you could look at a broken sink faucet or something else and get step-by-step instructions on how to repair it while you work.

While introducing the new glasses today, Meta CEO Mark Zuckerberg admitted that he had expected the killer app that would eventually make smart glasses and mixed-reality headsets like the Meta Quest 3 really take off would be holograms that can superimpose virtual objects on real-world environments. But that’s still very much a work-in-progress technology that has yet to really catch on.

He says the rapid advance of generative AI models, though, has presented a new opportunity for wearables that might be just as important. While you could theoretically do all of the things mentioned above with a phone or other devices, a wearable device like these smart glasses could make everything feel simpler and more natural than sticking a phone or VR headset between your face and the rest of the world.

Or you could just use the Ray-Ban Meta Smart Glasses as wearable cameras. While the original Ray-Ban Stories glasses had dual 5MP cameras, the new ones have 12MP ultra-wide cameras for higher-quality photography and support capturing up to 60 seconds of 1080p video at a time, saving photos or videos to your devices, sharing them with contacts, or live streaming video.

Meta says the new glasses also feature better audio, with new speakers that are up to 50% louder and offer twice as much bass, along with better support for directional audio. That means you should be able to hear phone calls, music, or other audio even in noisy environments. The five built-in mics can also capture immersive audio when you’re recording videos.

The glasses are powered by Qualcomm’s new Snapdragon AR1 Gen 1 system-on-a-chip, and compared with the first-gen Ray-Ban Stories, the new glasses have more storage (32GB rather than 4GB), a lighter-weight design (133 grams rather than 195 grams), and IPX4 water resistance. The Ray-Ban Meta Smart Glasses also feature WiFi 6 and Bluetooth 5.3, which is a step up from the WiFi 5/BT 5.0 support of the previous model.

Meta says you should be able to get “up to 36 hours” of battery life when using the glasses with a charging case that provides “up to eight additional charges,” which suggests the glasses themselves should be good for around four hours of usage per charge.
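That four-hour figure is just simple division of Meta's "up to" claims, so treat it as a rough upper bound rather than a measured result. A quick sanity check:

```python
# Sanity-check the implied per-charge runtime from Meta's marketing numbers:
# "up to 36 hours" total, across the glasses' own charge plus
# "up to eight additional charges" from the case.
total_hours = 36
charges = 1 + 8  # initial charge on the glasses + 8 refills from the case

hours_per_charge = total_hours / charges
print(hours_per_charge)  # → 4.0
```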


Join the Conversation

17 Comments


  1. The thing I’m most interested in hearing about is the legal implication of the data that they will inevitably be gathering with these devices.

    It seems pretty reasonable to assume they will have facial recognition software, and I’m curious what limitations they will face with storing that data.

    For example, as a person who has no services or accounts with Meta, if a user looks at me, and the device collects facial recognition data from me, is that data stored anywhere? Is it later cross-referenced with other images held by other Meta services? Like if another user later posts a picture of me on Facebook, does Facebook link that earlier facial recognition data to my face in the posted photo?

    I don’t care if a company checks my face to see if it matches facial recognition data that they’re storing on their users. But I do care if they store my data as someone who didn’t register, permit, or consent. I don’t want to live in a world where one day companies like Meta can pinpoint my exact location in the world by having an army of users that allow companies like Meta to use their wearable hardware to check facial recog data on every person they pass in public.

    I know Meta deleted all of their facial recognition data in 2021, but was that perhaps because their Terms of Use policy under which that data was collected was insufficient for this level of use?

  2. It’s kind of a weird device. You have to be comfortable sending a bunch of data to Meta. Since these are wifi/bluetooth only, you either have to have your phone or be in wifi range all the time – so the only advantage over just using a phone is that the glasses are already on your face?

    Since there’s no display, if you’re fixing a faucet, you have to make do with audio instructions only – and I’d be shocked if they process the video quickly enough to tell you “no, you’re tightening the wrong screw; tighten the one on the left instead” while you’re actually working on it. Instead, you get either a GPT answer read aloud to you, or the audio from a YouTube video.

    I’d actually be interested if you could just use the glasses to share what you see with someone else while you’re talking to them. Being able to see my parents’ computer screen while I’m walking them through fixing a problem? That would be great! Or maybe even calling a plumber about the faucet so he can tell what tools or parts to bring out.

    1. Well, yeah, even if you can only stream to Facebook live, you can have someone you know talking to you while watching the stream, although you might need a phone to do that.
      I really think that live streaming from your EXACT perspective is what’s going to sell most of these, more than anything else. Yeah, you can use a GoPro for that, but it’ll be slightly off and not as easy to wear. Facebook will probably have to come up with some ridiculously complex and expensive-to-run means of determining if you’re watching something so they’re not distributing low quality copies of movies and stuff, probably driven by machine learning.

      Of course, just because it’s easy doesn’t mean it’s right to demand or even expect people to feed all that data to Facebook just so you can see what they see!

  3. So… The sunglasses have cameras, loudspeakers, a mini computer, and some wireless connectivity. But there’s no display in the sunglasses? I would have assumed that they had some kind of projection onto the lenses so you could read the “menu translation” just like with Google Lens.
    I also assume that those are only for people without prescription glasses…

  4. Only if you can buy it without that FUGLY as hell Ray-Ban logo and for $150 less, at least…

  5. ”While you could theoretically do all of the things mentioned above with a phone”

    Do many people do these things with their phone today? I don’t but I tend to be behind the times. Just wondering if people would get glasses for these things by comparing how much it’s done on phones today.

    1. I’d probably use my phone to translate signs and menus if I ever ran into any I couldn’t read, because, I mean, what else is there? Asking for help is considered rude and makes you look stupid, and laptops almost never have world-facing cameras you can whip out.
      But looking up how to do things on a phone is awful. That’s less the phone’s fault and more the way information on the internet has become organized resulting in terrible experiences on search engines, but nevertheless sifting through all the junk on a small screen is far worse than doing so on a personal computer.

      1. So you never actually used your phone for that? That was the base question. I would if I needed to but I never needed to. I guess the small percentage of phone users who travel internationally might have frequently.

        Now I’d even less likely get glasses to do it for the almost never situation I’d need to. For that menu at a foreign themed restaurant? I’d ask the employees. That’s what the (sometimes forced) tip is for.

        As for doing stuff on your phone being difficult. I’d imagine doing the same thing on these glasses would be worse.

        1. Well, yes, I don’t. I was sort of trying to imply that if I was a respectable and richer person who traveled more like you’re supposed to in order to be considered worth anything, I would use my phone for these things. After all, the waiters would hate my guts for being from where I’m from.

  6. “While you could theoretically do all of the things mentioned above with a phone or other devices, a heads-up display could make everything feel simpler and more natural than sticking a phone or VR headset between your face and the rest of the world.”
    That’s possibly true, but as far as I can tell based on not reading much more than this article, these glasses don’t have one. This means that, if it does take a picture of a menu and translate it, you’ll have to hear the translation read out to you. Unlike a display where you can easily skim through it until you see the areas of interest, a menu being read out will likely take a while to get through the appetizers so you even know what other sections there might be. It’s still potentially useful if the alternative is that you can’t read it at all, but how many customers will want to buy something with that limitation and how many would use it when they could probably perform the same task using their phone and then they can read the translation on the phone’s screen?

  7. When glasses with cameras pointing at my face become the norm, I will start wearing a niqab / burqa. And I say that as an atheist man.

  8. “Useful” to the extent that any sunglasses are useful. Which is to say, indoors, not very. It’d be better if the design defaulted to the Lone Ranger sort of big wrap around frames that fit over typical prescription lenses. Otherwise, the market is limited to young healthy users with sufficiently good vision they don’t need corrective lenses. The last asymmetry is that the camera takes a lot of data IN as video, but the speakers can only respond with a limited, relatively slow, amount of data back as audio. Typical use case on a browser, or anything else, is the opposite — a short quick question that prompts a long and detailed answer.

  9. Having replaced a few faucets, I can definitely say that whatever generic instructions that thing gives you are going to be totally unreliable. Yeah, it might know you’re looking at a faucet, but it’s not going to give you any information about which mixer valve cartridge you need, or if there’s something you need to unlatch or unscrew before you can get the old one out. And if you can figure that out, you can probably figure out the rest of the steps. Just do yourself a favor and tape the manual for your sink and anything under it to the inside of the sink cabinet door.

    They’ll still sell more of these with actual streaming capacity, although privacy and information security are going to start bothering people a little bit more once someone walks into a theater wearing these things! Just a little bit though, ’cause after all, no decent person should have any concern about whether that obnoxious way you scratch your head gets put up for everyone to laugh at; or if someone just walks up to random people on the street and asks terrifying political questions to try and ruin your life by way of you giving the wrong answer; or that everything you look at is getting datamined by Facebook, which hopes for nothing less than to be able to dictate your life to you. It’s only when Hollywood’s profits are threatened that anything like THAT becomes important.

    1. You’re probably right about that. Live streaming the latest blockbuster straight from your fairly normal looking glasses to the world would probably be much easier than trying to film it from your phone or other type of video camera without anyone seeing what you are doing.