Samsung aims to address concerns about its camera processing, specifically the Moon photo detection system introduced with the Galaxy S21. The feature, activated through Scene Optimizer, uses AI to recognize when a clear image of the Moon is being captured at 25x zoom or greater. The system then adjusts brightness, captures multiple frames to improve image quality, and runs a neural network that enhances detail by referencing a high-resolution Moon image.
Users can disable Scene Optimizer, and Samsung clarifies that the feature will not engage if the Moon is obscured or if the photo is taken from somewhere other than Earth. Because the Moon is tidally locked, the same face is always visible from Earth, giving the system a consistent, predictable surface to recognize and enhance.
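To make the described pipeline concrete, below is a minimal, hypothetical sketch in Python of the three stages Samsung describes: detecting a Moon-like scene at 25x zoom or greater, stacking multiple frames to reduce noise, and blending in detail from a stored high-resolution reference. Every function name, threshold, and the simple box-blur detail transfer are illustrative assumptions; Samsung has not published its actual algorithm.

```python
import numpy as np

MIN_ZOOM = 25.0      # Samsung says the feature engages at 25x zoom or greater
BRIGHT_LEVEL = 150   # assumed pixel threshold for spotting a bright lunar disc

def looks_like_moon(frame: np.ndarray, zoom: float) -> bool:
    """Crude stand-in for the AI scene detector: a bright disc on a mostly dark sky."""
    if zoom < MIN_ZOOM:
        return False
    bright_fraction = (frame > BRIGHT_LEVEL).mean()
    return 0.01 < bright_fraction < 0.5

def stack_frames(frames: list) -> np.ndarray:
    """Average several exposures to suppress sensor noise (the multi-frame step)."""
    return np.mean(np.stack(frames), axis=0)

def enhance_detail(image: np.ndarray, reference: np.ndarray, strength: float = 0.4) -> np.ndarray:
    """Blend high-frequency detail from a stored reference Moon image into the shot.

    This is the contested step: the added texture originates in the reference,
    not in what the sensor actually recorded.
    """
    k = 5  # box-blur kernel size used to isolate high-frequency detail
    h, w = reference.shape
    padded = np.pad(reference, k // 2, mode="edge")
    blurred = np.zeros_like(reference, dtype=float)
    for dy in range(k):
        for dx in range(k):
            blurred += padded[dy:dy + h, dx:dx + w]
    blurred /= k * k
    detail = reference - blurred
    return np.clip(image + strength * detail, 0, 255)

# Usage: simulate a burst of noisy Moon frames plus a clean stored reference.
rng = np.random.default_rng(0)
reference_moon = np.zeros((64, 64))
yy, xx = np.mgrid[:64, :64]
reference_moon[(yy - 32) ** 2 + (xx - 32) ** 2 < 20 ** 2] = 220  # bright disc
burst = [reference_moon * 0.8 + rng.normal(0, 15, reference_moon.shape) for _ in range(8)]

if looks_like_moon(burst[0], zoom=30.0):
    stacked = stack_frames(burst)
    final = enhance_detail(stacked, reference_moon)
```

The last step, enhance_detail, is the crux of the controversy: whatever texture it adds comes from the reference image rather than from the light that hit the sensor.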
This explanation follows accusations from Reddit user ibreakphotos, who alleged that Samsung was digitally augmenting Moon images with details not present in the original captures. The user demonstrated this by photographing a deliberately blurred, low-resolution Moon image displayed on a computer screen, a source the phone could not realistically sharpen, yet the resulting shot showed surface detail that wasn't there. Even when the on-screen image was overexposed, the device still appeared to add information beyond what was visible.
While Samsung asserts that it uses the original shot as a reference point, the algorithms employed may result in photographs that do not accurately reflect reality. Acknowledging this potential confusion, Samsung states that it is refining Scene Optimizer to clarify the distinction between genuine lunar photographs and those enhanced by its technology.
This situation isn't unique to Samsung; other smartphone manufacturers have faced scrutiny for manipulating photo output. Some brands ship beauty modes that reshape skin and body features, creating unrealistic representations. The difference here is that Samsung's claims imply its devices can capture detail the hardware cannot optically resolve, potentially misleading consumers about the real photographic capabilities of models like the Galaxy S23 Ultra.