
Broken right after launch? The iPhone 17's camera has a bug, and such failures have become the norm in the industry.

雷科技 (Lei Technology) 2025-09-20 09:16
The flaw cannot eclipse the jade.

One of the perks of buying an iPhone is that it's a global brand: the moment something goes wrong, the whole world is watching, and everyone rushes to report it.

Sure enough, as soon as real units reached people's hands, things went sideways...

After all, it was launch day for the new phones. The moment I woke up and opened social media, sure enough, the iPhone 17 was dominating the trending lists. Lei Technology also sent people out early to queue at the Tianhuan Apple Store so we could get our hands on the phones right away and produce content for everyone. Stay tuned if you're interested.

Interestingly, mixed in among the people happily showing off their new phones and angrily cursing scalpers for jacking up prices, there were some rather different voices.

Perhaps Apple's programmers knew a holiday was coming up. Before the new phones in reviewers' hands had even warmed up, some of them discovered that the "annual flagship" they had spent a fortune on had a real problem with its camera, producing a rather comical bug.

(Image source: CNN)

You're not seeing things. This is a black block surrounded by dozens of irregular white curves.

This isn't some avant-garde concert stage design. According to the tech outlet MacRumors, Apple has acknowledged camera issues with the iPhone 17 series and the iPhone Air: when the camera points directly at extremely bright LED lights, black squares and white curves can appear in the resulting photos. The problem can be reproduced reliably, so it's hard to write off as a one-off glitch.

All of a sudden, there were a lot of complaints.

Did the iPhone 17 have bugs right after its release?

Here's what happened.

Recently, Henry Casey, a review editor at CNN Underscored, published an early review of the iPhone Air. Alongside praise for the product's performance and build quality, he also made a point of his dissatisfaction with its external speakers and imaging. The key sentence is this:

"I also noticed a strange imaging problem at this concert. On the iPhone Air and the iPhone 17 Pro Max, about 1 in every 10 photos taken had a small part blacked out, including parts of the white wavy lines on the boxes and the large LED board behind the band."

(Image source: CNN)

Honestly, if he hadn't pointed it out, I'd have assumed this glitchy-looking stage design was intentional and rather interesting...

Of course, joking aside, I'm actually quite curious about what exactly caused this bug. You know, for many users, concerts and live events are one of the most important shooting scenarios for smartphones. Not being able to shoot live performances would be a disaster.

Apple's explanation was rather vague, only saying that direct exposure of the camera to extremely bright LED screens would trigger this situation.

Here's the prevailing theory online. Phones today all lean heavily on "computational photography," and Apple is one of the leaders in the field. At a concert in particular, where pitch-black areas sit right next to extremely bright ones, the system will automatically switch on HDR so you can see both the detail in the shadows and the outlines in the highlights.

With HDR on, the iPhone fires off several photos at different exposure levels within a fraction of a second. For example, one frame is heavily underexposed so the stage lights and screens aren't blown out; one is exposed normally to handle the mid-tones; and another is heavily overexposed to pull detail out of the dark corners.

(Image source: X)

Next, the ISP and NPU "stitch" these frames together into the final, detail-rich photo you see.
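To make that a bit more concrete, here's a toy Python sketch of the idea: simulate three bracketed exposures of the same scene, then blend them with a simple "well-exposedness" weighting. The numbers, the weighting function, and the whole two-function pipeline are illustrative assumptions on my part; a real ISP/NPU pipeline is vastly more sophisticated.

```python
import numpy as np

# A toy sketch of exposure bracketing plus a simple "well-exposedness"
# merge. All numbers and the weighting scheme are illustrative assumptions;
# a real ISP/NPU pipeline is far more sophisticated.

def simulate_exposure(scene: np.ndarray, ev: float) -> np.ndarray:
    """Scale scene brightness by 2**ev and clip to the sensor's 0..1 range."""
    return np.clip(scene * (2.0 ** ev), 0.0, 1.0)

def fuse(frames: list) -> np.ndarray:
    """Weight each pixel by how close it sits to mid-grey, then average."""
    weights = [np.exp(-((f - 0.5) ** 2) / (2 * 0.2 ** 2)) + 1e-6 for f in frames]
    total = sum(weights)
    return sum(w * f for w, f in zip(weights, frames)) / total

# Three sample points of a scene: a dark corner, a mid-tone, and an LED wall
# three times brighter than the sensor can record at normal exposure.
scene = np.array([0.02, 0.25, 3.0])

frames = [simulate_exposure(scene, ev) for ev in (-2, 0, +2)]  # under / normal / over
print("bracket:", [f.round(3).tolist() for f in frames])
print("fused:  ", fuse(frames).round(3).tolist())
```

The underexposed frame keeps detail in the LED wall, the overexposed one lifts the dark corner, and the merge leans on whichever frame is best exposed at each pixel.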

The problem is the LED screens at concerts. They don't stay lit continuously; they flicker at a frequency far too high for the human eye to notice (PWM dimming). When the iPhone rattles off its high-speed burst, one frame may catch the moment the LEDs are on and come out bright or overexposed, while the next catches the moment they're off and comes out completely dark.
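Here's a rough way to picture that aliasing in code. The flicker frequency, duty cycle, exposure time, and frame start times below are made-up illustrative values, not measurements of any real LED wall or of the iPhone:

```python
import numpy as np

# A toy model of PWM flicker aliasing; all parameters are assumptions
# chosen for illustration only.
PWM_FREQ_HZ = 2000.0      # assumed flicker frequency of the LED panel
DUTY_CYCLE = 0.5          # fraction of each PWM period the LEDs are lit
EXPOSURE_S = 1 / 8000.0   # a short exposure meant to protect the highlights

def captured_brightness(start_s: float) -> float:
    """Average LED brightness seen during one short exposure window."""
    period = 1.0 / PWM_FREQ_HZ
    t = np.linspace(start_s, start_s + EXPOSURE_S, 1000)
    lit = ((t % period) / period) < DUTY_CYCLE
    return float(lit.mean())

# Frames in an HDR burst start at essentially arbitrary moments relative to
# the PWM cycle, so the same panel can look bright, black, or half-lit.
for start_s in (0.0000, 0.0033, 0.0062):
    print(f"frame at {start_s * 1000:.1f} ms: panel brightness ≈ "
          f"{captured_brightness(start_s):.2f}")
```

In this toy run the same panel reads fully lit in one frame, completely dark in another, and partially lit in a third, which is exactly the kind of contradiction the merge then has to deal with.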

When the algorithm is handed two frames with such contradictory information at the same position, it gets confused and effectively gives up, spitting out corrupted, overflowed, or undefined data.

What finally appears in the photo are those strange black boxes, white lines, and noise.

Of course, this is just what netizens are saying, but one thing is certain: it's not that your lens glass is broken, nor that the CMOS sensor is burned by a laser. At most, it's just a small bug in the software algorithm.

As for Apple... it says it plans to fix the problem in a future software update, though it hasn't said when the patch will ship.

When product failures become the norm in the mobile phone industry

Seeing this, some readers may already be fuming:

"This phone costs over ten thousand yuan, and it can still have such seemingly low - level software errors? Tim Cook, you old guy, only know how to make money!"

My opinion is, don't be in a hurry.

This kind of "fiasco" of computational photography in specific scenarios isn't anything new in the entire mobile phone industry. Apple isn't the first to encounter this problem, and it definitely won't be the last.

Here, let's take the once-sensational "photoshopped moon" controversy as an example.

(Image source: X)

A few years ago, every phone maker loved to promote a "super moon" mode. At the time, almost all of them claimed their phones could capture the moon clearly, craters and all. Even amateur photographers found this was genuinely the case, leaving people marveling at the "technology and tricks" involved.

Then a meticulous netizen ran an experiment: he put a blurry picture of the moon on a computer screen and photographed the screen with a phone. The phone still "captured" a high-definition moon. Others found that as long as they pointed these phones at any round glowing object at night, they would get a crisp photo of the moon.

The saga ended with some manufacturers adding a toggle for the feature in the camera app.

Let's also talk about the "HDR ghosting" that almost everyone has encountered.

(Image source: Google)

This phenomenon is more common than the moon controversy and better illustrates the limits of algorithms in dynamic scenes. Turn on HDR, photograph a moving subject, say a running pet or a waving friend, then zoom in, and you're likely to find a semi-transparent "ghost" or double image along the subject's edges.

The process of this "ghost" appearing is similar to the bug in the iPhone 17. As mentioned before, HDR requires taking multiple photos continuously for synthesis. If your subject moves during this fraction of a second, when the algorithm tries to align and merge these two photos, this phenomenon may occur.

To tackle this, Google designed a new spatial merging algorithm that decides, pixel by pixel, whether to merge content from other frames, and shipped it to the affected Pixel models in a later update.
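The gist of that kind of robust, per-pixel merge can be sketched in a few lines: blend an alternate frame into the reference only where the two agree, and keep the reference pixel where they disagree. To be clear, this is my own toy rendition of the concept, not Google's actual algorithm:

```python
import numpy as np

# A toy rendition of a robust, per-pixel ("spatial") merge: blend where the
# frames agree, keep the reference where they don't (i.e. where something
# moved). This is a sketch of the idea, not Google's actual algorithm.
def robust_merge(reference: np.ndarray, alternate: np.ndarray,
                 threshold: float = 0.2) -> np.ndarray:
    agree = np.abs(reference - alternate) < threshold
    blended = (reference + alternate) / 2.0
    return np.where(agree, blended, reference)

reference = np.zeros(12)
alternate = np.zeros(12)
reference[3:6] = 1.0
alternate[5:8] = 1.0   # the subject moved between the two frames

print(robust_merge(reference, alternate))
# The subject keeps its position from the reference frame, with no
# half-transparent ghost where the two frames disagreed.
```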

Yes, just like what Apple is going to do now.

Conclusion: minor issues aside, the iPhone 17 is still worth looking forward to

Did the iPhone 17 series have a fiasco?

My opinion is that there are indeed some bugs, but it's not a big deal.

Ultimately, this happened because the laws of physics put a hard ceiling on what a phone camera can do, which makes reliance on algorithms inevitable. Computational photography has become the go-to way around those physical bottlenecks, but it also brings problems that are hard to foresee in advance.

(Image source: Lei Technology)

If you're one of the first batch of iPhone 17 users, don't lose your temper over this, and don't rush to after-sales service to argue.

First of all, in our own test shooting the iPhone 17 Pro Max against a high-brightness MiniLED screen, the issue was very hard to reproduce in everyday conditions. We spent five minutes shooting back and forth, and not a single photo showed the problem. It probably only appears in a complex environment like a concert.

Secondly, Apple pushes OTA updates quickly, and the next one will very likely fix this. After all, it isn't a hardware design problem; it may well be patched before you ever run into it.

There's really no need to overreact about needing a patch.

This kind of thing is common in the industry. When the Google Pixel 6 first launched, its under-display fingerprint reader was as slow as a sloth, but a software update quickly optimized the recognition algorithm. Chinese flagships likewise often push updates shortly after launch to fine-tune the camera's color science and focusing logic based on feedback from the first wave of media and users.

The existence of "patches" proves that the manufacturers are constantly striving to improve their products.

Of course, it would be better if these optimizations could be completed before the product is officially launched.