
What they don’t tell you about wearing a camera on your face

A person wearing the Ray-Ban Meta smart glasses, taking a photo.
Andy Boxall / Digital Trends

I’ve been wearing the Ray-Ban Meta smart glasses for a few weeks now, which means I’ve spent that time living with a camera on my face. Being able to capture that cool first-person viewpoint was one of the things that attracted me to these glasses, and I’ve had fun with it so far. But there’s a lot more to it than I’d considered before actually snapping photos.

It turns out that taking photos with the Ray-Ban Meta’s camera is dramatically different from taking photos with your phone, in ways I just didn’t expect, and it has forced me to rethink how I take photos.

Your eyes aren’t the viewfinder

A person wearing the Ray-Ban Meta smart glasses.
Andy Boxall / Digital Trends

All it takes to capture a photo or a video with the Ray-Ban Meta is a press of the button on one of the arms. It’s very easy, and much faster than getting your phone out. However, it turns out there’s a huge difference between the photo I think I’m taking with my smart glasses and the photo I actually take.

Exactly what you’ve captured is a mystery until you import the photo onto your phone, and my brain has proven a terrible judge of the result. The trouble is, there’s no viewfinder on the Ray-Ban Meta, so you look in the general direction of your subject, press the button, and hope for the best. My eyes and my overall perception of the scene don’t match what the camera sees, and learning to predict what the photo will look like is very difficult.

We’re conditioned to compose our photographs using the viewfinder or screen on a camera or phone, where it’s obvious what’s going to be in the frame. I’ve found my mind’s eye is very unreliable in composing the shot I want to take when I’m using the Ray-Ban Meta, as the photo almost never comes out as I imagined.

There are many reasons for this. The camera takes photos in portrait orientation, yet I look at things in landscape, and I’m rarely as close as I think I am to the subject for it to correctly fill the frame. Even getting the horizon level is surprisingly difficult. My eyes don’t have their own built-in level, and the slightest subconscious head tilt is emphasized tenfold in the photo.

A completely new way of taking pictures

Understanding what you’re taking a photo of with the Ray-Ban Meta smart glasses is surprisingly hard. I’m never really sure how close I need to get to the subject, whether anything is centered in the frame, or whether what I want to capture will be visible at all. My brain thinks I’m taking the perfect photo, and I’m already imagining sharing it online to rapturous applause as I press the shutter release. But when I see the final photo later on, it’s not at all how I pictured it.

We look around us and focus on things near and far without really thinking about it. My brain fools me into thinking the smart glasses see what I’m seeing and will capture the exact photo I have in mind. Because there’s no way to check before pressing the button, as there is with a phone, I end up taking a lot of photos where the subject is unclear, despite it being obvious in my mind at the time. It’s hard to judge how the wide-angle lens will treat a scene, so a point of interest that looked close in real life often ends up small and distant in the photo.

It’s hard to describe until you’ve tried it, but be warned: it takes a long time to learn where the Ray-Ban Meta camera’s strengths lie and how to shoot around its shortcomings. Even once you get better at this, it’s still very hard to visualize exactly what you’re photographing.

In practice, that means taking a lot of photos and being prepared to discard the many that don’t work out.

Be prepared to edit your photos … a lot

The good news is the photo you had in your mind’s eye is still possible to recreate with the Ray-Ban Meta’s camera, but you’ll almost certainly have to spend some time editing the image you take to get there. While we’re all used to applying a filter to our photos or tweaking the saturation or contrast levels, the Ray-Ban Meta’s photos mean getting very comfortable with the crop and straightening tools, too.

Cropping is usually essential, as the wide-angle view often includes a lot of scenery that adds nothing to the image. Something I’ve found, which may be unique to me, is that headwear can be a problem too. I wear baseball caps a lot, and the brim is a constant unwanted intruder in Ray-Ban Meta photos that I have to crop out. Photos tend to need only a few degrees of adjustment to straighten them, and although they have a lovely tone, some benefit from contrast, exposure, and saturation tweaks.
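For anyone who prefers to batch these edits rather than do them by hand, the whole routine is simple to script. Here’s a minimal sketch using the Pillow library; the image dimensions, crop box, and adjustment amounts are all illustrative assumptions, not values from the Ray-Ban Meta app, and a real script would open your exported JPEG instead of the blank stand-in image created here.

```python
from PIL import Image, ImageEnhance

# Stand-in for an exported Ray-Ban Meta shot; a real script would use
# Image.open("your_photo.jpg") instead. Dimensions here are assumed.
photo = Image.new("RGB", (3024, 4032), (180, 200, 230))

# 1. Crop: trim the wide-angle excess, including a cap brim at the top.
#    The crop box (left, upper, right, lower) is an arbitrary example.
w, h = photo.size
photo = photo.crop((int(w * 0.10), int(h * 0.15), int(w * 0.90), int(h * 0.95)))

# 2. Straighten: a few degrees is usually enough; expand=True keeps
#    the rotated corners inside the canvas instead of clipping them.
photo = photo.rotate(-3, expand=True, resample=Image.BICUBIC)

# 3. Tone: gentle contrast and saturation boosts (1.0 = unchanged).
photo = ImageEnhance.Contrast(photo).enhance(1.10)
photo = ImageEnhance.Color(photo).enhance(1.05)
```

A final `photo.save("edited.jpg")` would write the result; the exact numbers are a matter of taste, which is rather the point of the paragraph above.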

All this extra thought and effort may have you wondering whether it’s worth it. I’m happy to say the answer is a resounding yes.

The first-person view is unlike anything you’ll get from a device that isn’t attached to your head, and because it’s easy to have your hands in the shot, it’s obvious you aren’t simply taking the photo with a regular phone or camera. Beyond this, the Ray-Ban Meta smart glasses force you to think about your photos in a new way, and to take photos when you normally wouldn’t reach for your phone or camera.

Unique, creative, and wonderful

A person wearing the Ray-Ban Meta smart glasses.
Andy Boxall / Digital Trends

The photos you take really can be different, eye-catching, and unique. For example, I rarely take photos inside one of my favorite coffee stops, but I was inspired to take a shot with the Ray-Ban Meta (it’s the final photo in the gallery above) without much expectation for the results. After a few editing tweaks described above, it captured the ambience of the space perfectly, and in a way that I wouldn’t have thought to do with my phone. In fact, I wouldn’t have taken the shot at all if I hadn’t been wearing the smart glasses.

There is an immediacy and a slightly concerning surreptitiousness to using the face-mounted camera, as it would have been very awkward to take the cafe photo with my phone. I don’t intend to use the Ray-Ban Meta camera creepily, but sadly, the potential is somewhat there, even though there’s a glowing light around the camera when you snap a photo. It’s something I’m very aware of when I am using them.

The photos you take really can be different, eye-catching, and unique.

This is counteracted by the fact that you’re often going to be a spectacle (no pun intended) when using the Ray-Ban Meta’s camera creatively. Remember, you’re not holding a camera; you’re moving your head and body to get a different angle or perspective, and as I explained, the shot will never be quite what you expect, so experimentation is essential. I’m aware that I bizarrely stop and stare at things before moving on, and no one knows I was taking a photo. When I took the photo of the car above, it looked like I was bowing down to worship it. You will look weird.

What no one tells you about using the Ray-Ban Meta’s camera is how much of an entirely new experience it is. Whether it’s understanding what your photos will actually look like and prompting you to creatively edit far more than usual or coming to terms with looking a bit odd while you take them, there’s no other camera like it. It’s a creative and enjoyable new method of capturing the world around you.

Andy Boxall