How many people play with their phones in the middle of the night and end up being blinded by HDR photos?
If you've been browsing Xiaohongshu or Weibo a lot lately, chances are you've ended up like Tony: tapping a photo in a post late at night under the covers, only to be blinded by a suddenly searing-bright screen...
Often, it's just the picture that's extremely bright, while the rest of the screen remains normal...
Well, although it's a bit of a shock to be suddenly flashed, the picture does look clear and sharp, and its quality is several levels higher than before.
The reason you're unexpectedly hit by a "flashbang" is that major smartphone manufacturers and social media platforms have recently been racing to support something called HDR images.
You may have come across HDR a lot, but what many people don't know is that 2025 can be considered the "first year" for social platforms to truly enable different phones to view and share HDR images.
Let me give you a very straightforward example. Two or three years ago, if you sent an HDR photo to a device that didn't support the HDR format, your photo might end up looking like this... completely messed up.
Left, the original image; right, how it looks on the recipient's device
Come on, not supporting it is one thing, but mangling the photo so you can't see it at all? What's that about...
To understand why HDR has such issues, we have to start from the beginning.
There are generally two standards for displaying photos: SDR and HDR. SDR (Standard Dynamic Range) is the old format.
Early digital images were usually SDR; the standard's assumptions even trace back to the film era. Roughly speaking, 8-bit color depth and about 100 nits of brightness were considered good enough.
Those parameters look pretty weak today, so why didn't anyone notice back then? Because display devices were just as limited; pretty much any screen could show everything SDR had to offer.
Moreover, it was more popular to print photos back then, rather than viewing them on a computer.
But that doesn't mean HDR didn't exist. If you really want to dig deeper, the earliest case dates back to around 1850, when the French photographer Gustave Le Gray worked around the dynamic-range limits of film by making separate exposures of the same scene.
There were two broad approaches to this. One was to split a scene with a large brightness difference into regions and shoot them separately. The other was to expose once for the bright parts and once for the dark parts of the same scene, then overlay the two negatives on the same photographic paper to produce an image with full detail in both highlights and shadows.
That is to say, over a hundred years ago, photographers had already discovered that by taking multiple photos of the same position and combining them, they could break through the dynamic range limitations of the film itself.
In the digital age, HDR research can be traced back to 1997, when a professor at the University of Southern California proposed foundational techniques for HDR imaging.
A few years later, some professional cameras gained a feature called exposure bracketing: shooting the same scene several times at different exposure levels. This laid the groundwork for HDR technology.
However, the feature mainly targeted professional users and had to be paired with the HDR merge function in Adobe Photoshop.
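The merge step behind bracketing can be sketched in a few lines. This is a simplified toy version of exposure fusion that weights each pixel by how well-exposed it is; it is not any camera vendor's actual pipeline, and the function name is illustrative:

```python
import numpy as np

def fuse_exposures(exposures, sigma=0.2):
    """Toy exposure fusion: weight each bracketed shot by how close
    each pixel sits to mid-gray (i.e. how well-exposed it is), then
    blend. Real pipelines also weight contrast and saturation and
    blend across a multiscale pyramid."""
    stack = np.stack(exposures).astype(np.float64)   # (N, H, W), values in [0, 1]
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)    # normalize per pixel
    return (weights * stack).sum(axis=0)

# Two fake 2x2 grayscale shots: one underexposed, one overexposed.
under = np.array([[0.05, 0.10], [0.40, 0.02]])
over  = np.array([[0.60, 0.95], [0.98, 0.55]])
fused = fuse_exposures([under, over])
```

For each pixel, the result leans toward whichever shot captured it closest to mid-gray, which is exactly why bracketing recovers detail in both highlights and shadows.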
In short, after a long period of exploration, HDR has finally emerged from the era of wild development and started to gain a foothold in the professional field.
Actually, up to this point, HDR was still mainly for professional users. The real hero that brought HDR from the professional niche to the general public is the progress of smartphones.
However, smartphones are both the promoters of HDR photos and the initiators of the chaotic situation...
Apple was the first to make HDR photography routine: starting with the iPhone 12 series, it supported HDR photos end to end, from shooting to viewing.
But if you look into the camera settings at that time, you'll find that Apple needed to use the HEIF format to shoot HDR photos, rather than the common JPG format.
One reason is that Apple uses a technology called a Gain Map, which records the per-pixel brightness difference between the HDR and SDR versions of a photo.
On an SDR device, the system simply shows the base image; on an HDR device, it applies the gain map so the hardware boosts brightness and displays the HDR content.
The HEIF format is more flexible: it can directly embed Apple's dedicated Gain Map, and it can also store additional data such as depth information and Live Photos.
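The gain-map idea itself is simple enough to sketch. The code below is an illustrative toy, not Apple's or Google's actual format or API: an SDR base image travels with a per-pixel map of log2 brightness gains, and an HDR-capable display multiplies the two together, clamped to whatever brightness headroom the screen has:

```python
import numpy as np

def apply_gain_map(sdr, gain_map, headroom=4.0):
    """Reconstruct HDR pixels from an SDR base plus a gain map.
    sdr:      linear SDR values in [0, 1], where 1.0 = SDR white
    gain_map: per-pixel log2 brightness gain (0 = unchanged)
    headroom: how many times brighter than SDR white the display
              can go (e.g. 4.0 = two extra stops)."""
    hdr = sdr * np.exp2(gain_map)      # boost each pixel by its gain
    return np.minimum(hdr, headroom)   # clamp to the display's limits

sdr  = np.array([[0.2, 0.9], [0.5, 1.0]])   # base image any device can show
gain = np.array([[0.0, 2.0], [1.0, 3.0]])   # log2 gains; 3.0 = 8x brighter
hdr  = apply_gain_map(sdr, gain, headroom=4.0)
# An SDR device simply ignores `gain` and shows `sdr` as-is.
```

This is also why the scheme degrades gracefully: a device that doesn't understand the gain map still has a perfectly normal SDR photo to display.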
HDR seems to be going smoothly on the Apple side, but on the Android side... Well, I can only describe it as "everyone for themselves".
The main reason is that both the hardware and software on the Android side were late in supporting HDR.
For example, Apple has supported the HEIF format since the iPhone 7, and the A10 chip also supports HEIF hardware decoding. On the Android side, Qualcomm didn't support HEIF hardware decoding until the Snapdragon 855, which is two years later than Apple in terms of hardware alone.
Not only is the hardware lacking, but due to the huge differences in system versions and configurations between new and old Android devices, it's also very difficult to promote HDR at the software level.
Of course, another very important point is that Google itself has been slow to come up with a unified standard to regulate HDR photos on Android.
At this time, domestic brands stepped up. Since Google wasn't doing it, they decided to do their own research...
For example, the green brand mentioned at the beginning is a good example. Tony especially remembers that back then, the Find X3 could take extremely impressive HDR photos because it supported the HEIF format.
The blue brand also followed this path. At first, the X80 Pro used its own XDR Photo technology.
Although each brand's HDR looked good, it could only be enjoyed inside that brand's own gallery app. Third-party apps still saw ordinary photos, and sometimes photos shared out even came out mangled...
It's not just the blue and green brands. Huawei, Xiaomi, and Honor have also been making great efforts in the imaging field, resulting in a situation in China in the past two years where everyone is "doing their own thing" in terms of HDR.
And don't forget, besides Android manufacturers, Apple has been going its own way...
So now you should understand why I said HDR was such a mess at the beginning...
Finally, at the Google I/O Developer Conference in 2023, when Android 14 was announced, Google introduced a new feature called Ultra HDR.
More importantly, Ultra HDR is the official standard for the Android 14 system. At this point, there is finally a small degree of unity in the Android camp for HDR.
Subsequently, OPPO and vivo also chose to follow or be compatible with Google's Ultra HDR standard, aiming to be in line with the native Android system.
Moreover, the process of HDR unification continues. For example, starting from the iPhone 16, Apple supports shooting HDR photos in the JPG format, so at least the format is universal for sharing.
Not only has compatibility improved dramatically, but social media platforms like Xiaohongshu are also actively working with manufacturers to adapt. HDR is no longer tied to a particular device or platform, making HDR content far easier to share and enjoy.
So many people, Tony included, feel that HDR suddenly became popular. In reality, with standards in place and the technology maturing, HDR has been rapidly moving from the professional field to the consumer level.
On top of that, phone performance and screen quality keep improving, and social platforms are actively adapting. HDR photos were bound to take off.