Samsung captured fake zoom images of the Moon

For years, Samsung “Space Zoom”-capable phones have been known for their ability to take incredibly detailed pictures of the Moon. But a recent Reddit post showed in stark terms just how much computational processing the company is doing, and — given the evidence provided — it feels like we should go ahead and say it: Samsung’s pictures of the Moon are fake.

But what exactly does “fake” mean in this scenario? It is a difficult question to answer, and one that becomes increasingly important and complex as computational techniques are further integrated into the photographic process. We can safely say that our understanding of what makes a picture fake will soon change, as it has in the past to accommodate digital cameras, Photoshop, Instagram filters, and more. But for now, let’s stick to the matter of Samsung and the Moon.

The test of Samsung’s phones by Reddit user u/ibreakphotos was brilliant in its simplicity. They created a deliberately blurred image of the Moon, displayed it on a computer screen, and then photographed that screen using a Samsung S23 Ultra. As you can see below, the image on the screen showed no detail at all, but the resulting picture was a sharp and clear “photograph” of the Moon. The S23 Ultra added details that simply weren’t there before. This was not the upscaling of blurry pixels or the retrieval of seemingly lost data. There was just a new Moon: a fake one.

Here is the blurred image of the Moon that was used:

A GIF of the photo capture process:

And the resulting “photograph”:

This is not a new controversy. People have been asking questions about Samsung’s lunar photography ever since the company unveiled a 100x “Space Zoom” feature in its S20 Ultra in 2020. Some have accused the company of simply copying and pasting pre-stored textures on images of the Moon to produce its photographs, but Samsung says the process is more involved than that.

In 2021, Input Mag published a lengthy feature about the “fake detailed moon shots” taken by the Galaxy S21 Ultra. Samsung told the publication that “no image overlay or texture effects are applied when you take a photo,” but that the company uses AI to detect the Moon’s presence and “then offers a detail-enhancing feature by reducing blur and noise.”

The company later offered a bit more information in this blog post (translated from Korean by Google). But the heart of the explanation, the description of the vital step that takes us from a photograph of a hazy Moon to a sharp one, is couched in obfuscating terms. Samsung simply says that it uses a “detail enhancement engine function” to “effectively remove noise and maximize detail on the moon to complete a bright and clear image of the moon” (emphasis added). What does that mean? We simply don’t know.

A “detail enhancement engine feature” is to blame

The generous interpretation is that Samsung’s process captures blurry details in the original image and then upscales them using AI. This is an established technique that has its problems (see: Xerox copiers silently altering numbers when compressing fuzzy scans), but I don’t think it would make the resulting photograph fake. As the Reddit tests show, though, Samsung’s process is more intrusive than this: it doesn’t just improve the sharpness of blurry details, it creates them. It is at this point that I think most people will agree the resulting image is, for better or worse, fake.

The difficulty here is that the concept of “fakeness” is a spectrum rather than a binary. (Like all the categories we use to divide the world.) For photography, the standard of “reality” is usually defined by the information received by an optical sensor: the light captured when you take the picture. You can then edit this information quite extensively, as professional photographers do when adjusting RAW images, tweaking color, exposure, contrast, and so on, and the end result is still not fake. In this particular case, however, the Moon images taken by Samsung’s phone look less like the result of optical data and more like the product of a computational process. In other words: it is a generated image more than a photo.

Some may not agree with this definition, and that’s fine. Drawing this distinction will also become much more difficult in the future. Ever since smartphone manufacturers began using computational techniques to overcome the limits of smartphones’ tiny camera sensors, the mix of “optically captured” and “software generated” data in their output has changed. We are definitely headed for a future where techniques like Samsung’s “Detail Enhancement Engine” will become more common and used more widely. You can train “detail enhancement engines” on all kinds of data, such as the faces of your family and friends to make sure you never take a bad photo of them, or on famous landmarks to enhance your vacation photos. In time, we will probably forget that we ever called such images fake.

Samsung says “no image overlay or texture effects are applied when you take a photo”

But so far, Samsung’s Moon photos stand out, and I think that’s because the Moon is a particularly good subject for this kind of computational photography. For starters, the Moon is visually simple. It looks more or less the same in every image taken from Earth (ignoring libration and rotational differences), and while it has detail, it lacks depth. That makes AI enhancements relatively straightforward to add. And second: Moon photography is catnip for marketing, because a) everyone knows phones take bad pictures of the Moon, and b) everyone can test the feature for themselves. This has made it an easy way for Samsung to illustrate the photographic prowess of its phones. Just look at this ad for the S23 Ultra with an 11-second Moon zoom:

It is this viral appeal that has landed the company in trouble. Without properly explaining the feature, Samsung has allowed many people to mistake its AI-enhanced photos for the product of a physics-defying optical zoom that couldn’t possibly fit into a smartphone. In turn, this has led others to debunk the images (because the tech world loves a scandal). Samsung doesn’t exactly claim that its Moon photos are representative of all its zoom photography, but a consumer would be forgiven for thinking so, which is why it’s worth emphasizing what’s really going on.

Ultimately, photography is changing, and our understanding of what constitutes a “real photo” will change with it. But for now, it seems reasonable to conclude that Samsung’s Moon photos are more fake than real. Presumably, in a few years, that will no longer be the case. Samsung did not immediately respond to The Verge’s request for comment, but we’ll update this piece if the company gets back to us. In the meantime, if you want to take an unadulterated picture of the Moon using your Samsung device, just turn off the “Scene Optimizer” feature and get ready to photograph a blurry circle in the sky.
