By T.J. Thomson
Rumours and conspiracies have been swirling following the abdominal surgery and long recovery period of Catherine, Princess of Wales, earlier this year. They intensified on Monday when Kensington Palace released a photo of the princess with her three children.
The photo had clear signs of tampering, and international wire services withdrew the image amid concerns around manipulation. The princess later apologised for any confusion and said she had “experimented with editing” as many amateur photographers do.
Image editing is extremely common these days, and not all of it is for nefarious purposes. However, in an age of rampant misinformation, how can we stay vigilant around suspicious images?
What happened with the royal photo?
A close look reveals at least eight inconsistencies in the image.
Two of these relate to unnatural blur. Catherine’s right hand is unnaturally blurred, even though her left hand is sharp and at the same distance from the camera. The left side of Catherine’s hair is also unnaturally blurred, while the right side of her hair is sharp.
These types of edits are usually made with a blur tool that softens pixels. It is often used to make the background of an image less distracting or to smooth rough patches of texture.
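To make the idea concrete, here is a minimal sketch of how a blur edit works at the pixel level, using the Python imaging library Pillow. The file name and region coordinates are placeholders, and real editing tools use a soft-edged brush rather than a hard rectangle.

```python
# Minimal sketch: soften one region of a photo with a Gaussian blur,
# roughly what a "blur" or "soften" tool does under the hood.
# File name and box coordinates are placeholders.
from PIL import Image, ImageFilter

img = Image.open("family_photo.jpg")

# Region to soften, given as (left, upper, right, lower) pixel coordinates.
box = (820, 1040, 980, 1180)

# Cut out the region, blur it, and paste it back over the original pixels.
region = img.crop(box)
region = region.filter(ImageFilter.GaussianBlur(radius=4))
img.paste(region, box)

img.save("family_photo_softened.jpg")
```

The tell-tale sign in a finished photo is a patch that is noticeably softer than its surroundings even though it sits at the same distance from the camera.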
Five of the edits appear to use the “clone stamp” tool. This is a Photoshop tool that copies part of the same or a different image and “stamps” it onto another area.
You can see this with the repeated pattern on Louis’s (on the left) sweater and the tile on the ground. You can also see it with the step behind Louis’s legs and on Charlotte’s hair and sleeve. The zipper on Catherine’s jacket also doesn’t line up.
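A clone-stamp edit is, at its simplest, a copy-and-paste of pixels, which is why it can leave duplicated textures behind. The following sketch (again in Python with Pillow, with placeholder coordinates and file names) shows the basic operation; real tools feather the edges of the stamped patch to hide the seam.

```python
# Minimal sketch of what a clone-stamp edit does: copy a patch of pixels
# from one spot and stamp it over another. Reusing the same source patch
# is what produces the repeated patterns visible in an edited image.
# Coordinates and file names are placeholders.
from PIL import Image

img = Image.open("family_photo.jpg")

patch_size = 60
src_x, src_y = 400, 900    # where the pixels are copied from
dst_x, dst_y = 460, 900    # where they are stamped onto

patch = img.crop((src_x, src_y, src_x + patch_size, src_y + patch_size))
img.paste(patch, (dst_x, dst_y))

img.save("family_photo_stamped.jpg")
```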
The most charitable interpretation is that the princess was trying to remove distracting or unflattering elements. But the artefacts could also point to multiple images being blended together. This could either be to try to show the best version of each person (for example, with a smiling face and open eyes), or for another purpose.
How common are image edits?
Image editing is increasingly common as both photography and editing become more automated.
This sometimes happens without you even knowing.
Take HDR (high dynamic range) images, for example. Point your iPhone or equivalent at a beautiful sunset and watch it capture the scene from the brightest highlights to the darkest shadows. What happens here is your camera captures multiple exposures and automatically blends them into a single image with a wider range of contrast.
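A rough sketch of that blending idea is below, assuming three bracketed exposures of the same scene that are already aligned and the same size. It weights each pixel by how well exposed it is and averages across the frames; real camera pipelines use far more sophisticated alignment and tone mapping, and the file names are placeholders.

```python
# Minimal sketch of exposure blending, the idea behind HDR capture:
# several bracketed shots of the same scene are combined so that both
# highlights and shadows keep detail.
import numpy as np
from PIL import Image

# Bracketed exposures of the same scene: underexposed, normal, overexposed.
frames = [np.asarray(Image.open(f), dtype=np.float32) / 255.0
          for f in ("dark.jpg", "mid.jpg", "bright.jpg")]

# Weight each pixel by how "well exposed" it is (closest to mid-grey),
# then take the weighted average across the three frames.
stack = np.stack(frames)                        # shape: (3, H, W, 3)
weights = np.exp(-((stack - 0.5) ** 2) / 0.08)  # high near 0.5, low near 0 or 1
weights /= weights.sum(axis=0, keepdims=True)   # normalise per pixel

fused = (weights * stack).sum(axis=0)
Image.fromarray((fused * 255).astype(np.uint8)).save("fused.jpg")
```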
While face-smoothing or teeth-whitening filters are nothing new, some smartphone camera apps apply them without being prompted. Newer technology like Google’s “Best Take” feature can even combine the best attributes of multiple images to ensure everyone’s eyes are open and faces are smiling in group shots.
On social media, it seems everyone tries to show themselves in their best light, which is partially why so few of the photos on our camera rolls make it onto our social media feeds. It is also why we often edit our photos to show our best sides.
But in other contexts, such as press photography, the rules are much stricter. The Associated Press, for example, bans all edits beyond simple crops, colour adjustments, and “minor adjustments” that “restore the authentic nature of the photograph”.
Professional photojournalists haven’t always gotten it right, though. While the majority of lens-based news workers adhere to ethical guidelines like those published by the National Press Photographers Association, others have let deadline pressures, competition and the desire for exceptional imagery cloud their judgement.
One such example was in 2017, when British photojournalist Souvid Datta admitted to visually plagiarising another photographer’s work within his own composition.
Concerns around false or misleading visual information are at an all-time high, given advances in generative artificial intelligence (AI). In fact, this year the World Economic Forum named the risk of misinformation and disinformation as the world’s greatest short-term threat. It placed this above armed conflict and natural disasters.
What to do if you’re unsure about an image you’ve found online
It can be hard to keep up with the more than 3 billion photos that are shared each day.
But, for the ones that matter, we owe it to ourselves to slow down, zoom in and ask ourselves a few simple questions:
1. Who made or shared the image? This can give clues about reliability and the purpose of making or sharing the image.
2. What’s the evidence? Can you find another version of the image, for example, using a reverse-image search engine? (A rough sketch of how such image matching works follows this list.)
3. What do trusted sources say? Consult resources like AAP FactCheck or AFP Fact Check to see if authoritative sources have already weighed in.
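One way reverse-image search engines find other versions of a picture is by reducing each image to a compact perceptual fingerprint and comparing fingerprints. The sketch below uses the third-party Python package imagehash to compare two images this way; the file names and the distance threshold are illustrative assumptions, not a definitive recipe.

```python
# Minimal sketch of perceptual hashing, one idea behind reverse-image search:
# near-identical images (including lightly edited copies) end up with similar
# hashes, so a small hash distance suggests the same underlying picture.
# File names and the threshold are placeholders.
from PIL import Image
import imagehash

original = imagehash.phash(Image.open("photo_found_online.jpg"))
candidate = imagehash.phash(Image.open("possible_source.jpg"))

# Hamming distance between the two 64-bit hashes: 0 means identical,
# small values usually mean the same picture with minor edits or recompression.
distance = original - candidate
print(f"Hash distance: {distance}")
if distance <= 8:
    print("Likely the same underlying image.")
```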
T.J. Thomson is a Senior Lecturer in Visual Communication and Digital Media at Australia-based RMIT University.