What you get IS NOT what you see

A friend of mine was in a terrible road accident about three months ago and has scarring on her face. She showed me a recent photograph of herself taken on her mobile phone.

‘Do I really look like that?’ she asked.

The answer was ‘no’, and the reason why reminded me that, even though photography is going through a golden age of accessibility thanks to mobile phones, people still don’t understand that things don’t look the same through a camera as they do through your eyes. Yet, such is our touching faith in technology, we imagine the photo represents the truth better than what we see in the mirror or even with our own eyes.

Well, it ain’t so and here’s why.

It’s primarily down to the type of lens used and the processing that goes on in the background.

By the way, whenever I say ‘camera’ in this piece, I mean anything from the camera in a phone to a top-range DSLR and everything in between.


Let’s start by disproving a commonly held belief:

The camera records what I see accurately.

It doesn’t. There’s no end of image tweaking that goes on in modern cameras (or phones) BY DEFAULT.

The default is Vibrant

My new phone is a Motorola G6 Play. During the initial setup, one of the questions was: ‘Do you want to keep the display settings at their default, Vibrant, or set them to Standard?’ Standard meaning less ‘poppy’ but more natural.

The first thing I do when buying a modern camera is turn off all the automatic processing that manufacturers build into it. I don’t want the shadows lightened, for example: I frame with shadow, as in this example:

Dark Waterfall from the Out and About collection

The default settings on any modern camera, whether a top-end DSLR or your humble camera phone, are devised to give you an image which makes everything visible, especially areas in deep shadow or very bright light.

Manufacturers used to have lots of different names for this processing, but the principle is the same: lighten shadows to expose more detail, and tone down overbright areas to do the same. The result is an even exposure across all areas, light and dark.
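As a rough sketch of the shadow-lifting half of that idea: a simple gamma curve with a gamma below 1 raises dark values far more than bright ones. The gamma value here is an illustrative assumption, not a figure from any real camera’s firmware:

```python
def lift_shadows(value, gamma=0.7):
    """Map an 8-bit channel value (0-255) through a gamma curve.

    A gamma below 1 raises dark values far more than bright ones,
    so detail hidden in shadow becomes visible while highlights
    barely move. Real cameras use carefully tuned curves, often
    with a separate roll-off to compress highlights too.
    """
    return round(255 * (value / 255) ** gamma)

# Deep shadow, mid-tone, highlight:
for v in (20, 128, 240):
    print(v, "->", lift_shadows(v))
```

Run it and you’ll see a deep-shadow value of 20 more than doubles, while a highlight of 240 barely changes, which is exactly the ‘make everything visible’ effect described above.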

For the majority of people, this is a desirable thing and is why you rarely see a ‘bad’ photo these days. Under- or over-exposed shots are a rarity thanks to this kind of processing. Even blurring is less common thanks to impressive camera-shake reduction technology.

Now, you may be thinking ‘he’s talking about HDR’ – and if you don’t know what HDR is, you can find an excellent description of it here.
I’m not talking about HDR, however. I’m talking about the default processing that goes on in your camera or – especially – your phone’s processor, before you even engage the HDR setting.

The truth is, however, that these default settings don’t convey an image accurately. Along with lightening shadows and toning down bright areas to expose more detail, they:

  1. Increase contrast (minimising mid-tones to sharpen the divide between light and dark areas – disastrous when applied to delicate shifts in skin tone);
  2. Increase saturation (pumping up colour); and
  3. Increase sharpness (increasing the definition of the entire shot in a bid to add detail, changing the consistency of smooth areas to that of a gravel driveway. All that moisturising for nothing!).
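To make the first two of those adjustments concrete, here’s a toy sketch of a contrast boost and a saturation boost applied to a single pixel. The formulas and factors are simplified illustrations of the idea, not what any real camera pipeline does:

```python
def clamp(v):
    """Keep a channel value in the valid 0-255 range."""
    return max(0, min(255, round(v)))

def boost_contrast(r, g, b, factor=1.2):
    """Push channel values away from mid-grey (128), squeezing mid-tones."""
    return tuple(clamp(128 + (c - 128) * factor) for c in (r, g, b))

def boost_saturation(r, g, b, factor=1.3):
    """Push each channel away from the pixel's own luminance,
    'pumping up' the colour. Luma weights are the standard
    Rec. 601 coefficients."""
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    return tuple(clamp(luma + (c - luma) * factor) for c in (r, g, b))

# A soft, slightly warm skin tone:
skin = (224, 172, 140)
processed = boost_saturation(*boost_contrast(*skin))
print(skin, "->", processed)
```

Run it and the gentle skin tone comes out with the red channel slammed against 255 and the blue pushed down – a small step on the road to ‘lobster red’.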

For a landscape, or a group photo of people, these default effects are fine but for a selfie or portrait shot, such processing tends to ‘harden’ people’s faces and make them look spray-tanned brown or lobster red. The thing is, with mobiles, you mostly cannot turn these settings off. This is why phone manufacturers were quick to add a variety of filters to compensate.

Ever taken a selfie that makes your eyes bigger, smooths your skin and adds floral headbands and hearts flying around your head? Judging by my Facebook and Instagram feeds most of you have. Well stop it. You’re a person, not an elf or an anime character. Although that specific filter is at the extreme end of the spectrum, with the ‘rise of the selfie’ (the working title for the next Star Wars film) phone manufacturers have added dozens of ‘portrait’ filters to compensate for the unflattering effect of their default sensor settings.

So let’s return to my friend’s question: ‘Do I really look like that?’

Clearly the answer would be ‘no’, but there’s another, more compelling reason why portraits and selfies taken on a mobile are inaccurate: the lens.


What does the lens do? Without wishing to state the bleedin’ obvious, more than anything else on a camera, the lens affects what you see. Different types of lenses capture the image in different ways and are best suited to different types of photography.

Sports photography requires a zoom/telephoto lens so you can get close to the subject from far away, while close-up nature photography requires a macro lens, letting you focus from just millimetres away from your subject.

Skier by Johannes Waibel on Unsplash
Leaf by gil on Unsplash

You can take a photo with any lens but some are better suited to certain tasks than others.
Let’s avoid getting too technical here (with photography it’s all too easy to do so), so bear in mind I’m:

  1. talking about smartphone lenses vs camera lenses; and
  2. talking about which lens gives you the most flattering portrait shot.


To understand what I mean, let’s take a look at the range of photographs of a landscape below. For these photographs, the camera was mounted on a tripod and pointed at a fixed view; its position and direction never changed.

Nine photographs were taken, each with a lens of a different focal length, from an 18mm lens up to a 300mm lens. In the image, you can see the difference changing the lens makes to the photo you get.

Comparison photo from the Campbell Cameras website

Now let’s look at what those different lenses do when pointed at a face. Here’s a good example of how different focal lengths change the shape of a face, from a lighting tutorial on the Pro Video Coalition website.

Image from Pro Video Coalition

As a general rule of thumb, for portrait photography most photographers would choose a lens in the 85mm-300mm range, as it makes the face look closer to how it does in real life.
Anything below 50mm begins to distort objects close to the camera.
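Some rough geometry shows why. To fill the frame with a face, a wide lens has to be much closer than a long one, and at close range the nose is proportionally much nearer the camera than the ears, so it renders larger. The sketch below assumes a full-frame sensor (24mm tall), a head about 25cm tall filling the frame vertically, and a nose roughly 10cm in front of the ears, using a thin-lens approximation; the numbers are illustrative, not exact optics:

```python
SENSOR_HEIGHT_MM = 24.0    # full-frame sensor height (assumed)
HEAD_HEIGHT_MM = 250.0     # head filling the frame vertically (assumed)
NOSE_TO_EARS_MM = 100.0    # nose sits ~10cm in front of the ears (assumed)

def shooting_distance(focal_mm):
    """Distance (mm) needed for the head to fill the frame vertically,
    using the thin-lens magnification approximation."""
    return focal_mm * HEAD_HEIGHT_MM / SENSOR_HEIGHT_MM

def nose_magnification(focal_mm):
    """How much larger the nose renders relative to the ears,
    purely from its being closer to the camera."""
    ears = shooting_distance(focal_mm)
    nose = ears - NOSE_TO_EARS_MM
    return ears / nose

for f in (28, 50, 85):
    print(f"{f}mm lens: shoot from about {shooting_distance(f) / 10:.0f}cm, "
          f"nose renders {nose_magnification(f):.2f}x larger than the ears")
```

With these assumed numbers, a 28mm smartphone-style lens puts you around 29cm from the face and exaggerates the nose by roughly half again, while an 85mm lens lets you stand back and shrinks the effect to around a tenth – which is exactly why the longer lens flatters.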


An article on the Techspot website about smartphone camera hardware makes this statement:

‘All smartphones fall into the wide-angle lens bracket, typically somewhere around 24-30mm.’

And that, in a nutshell, is why they are not good for portraiture.

However, rules are made to be broken. Look at this example by photographer Zulmaury Saavedra. It’s an interesting portrait that uses the distortion caused by the wide angle lens for creative effect but it is certainly not a realistic interpretation of what the model looks like.

Photo by Zulmaury Saavedra on Unsplash

It’s really hard to find an example of a bad portrait caused by using the wrong lens because no photographer wants to put bad portrait photos online!

I hope you’re now closer to understanding why the camera on a phone produces poorer portrait shots: its lens is not the ideal type and distorts features, and the default processing in the camera is unflattering for portraiture.

As with many things, it’s a question of the right tool for the right job.

If you found this article interesting, here are links to some other websites that you might like, too.

How to Choose the Perfect Portrait Lens
