Apple Vision Pro is not finished.
At the Apple Developer Center in Shanghai, less than ten minutes into the workshop, the first serious topic of discussion was voltage.
An engineer from Blackmagic held up a battery and posed a question: can a step-up module that boosts the battery's 18 volts to over 24 volts guarantee stable recording?
There was a moment of silence in the room.
The answer is no. The Blackmagic URSA Cine Immersive, a camera costing over 200,000 yuan, needs a continuous supply of more than 112 watts to run properly. After boosting, the battery can deliver only a little over 90 watts. The camera may appear to work, but when recording at the high-spec 90 frames per second, dropped frames become very likely.
Blackmagic URSA Cine Immersive camera
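The shortfall described above is just conservation of power: a boost converter raises voltage but cannot create watts. Here is a minimal sketch of the arithmetic, assuming an illustrative 6 A battery current limit and 85% converter efficiency; only the 18 V battery, the boosted ~24 V output, and the 112 W camera draw come from the article.

```python
# Back-of-the-envelope check of the power budget described above.
# The battery current limit and converter efficiency are illustrative
# assumptions, not published figures.

def boosted_output_power(v_battery, i_battery_max, efficiency):
    """Maximum power a step-up (boost) converter can deliver.

    Boosting raises voltage but not power: P_out = P_in * efficiency,
    where P_in is capped by the battery's voltage and max current.
    """
    p_in = v_battery * i_battery_max
    return p_in * efficiency

camera_draw_w = 112  # continuous draw of the URSA Cine Immersive (from the article)
p_out = boosted_output_power(v_battery=18.0, i_battery_max=6.0, efficiency=0.85)

print(f"available after boost: {p_out:.0f} W")  # ~92 W with these assumptions
print("sufficient" if p_out >= camera_draw_w
      else "insufficient -> risk of dropped frames")
```

With these assumed numbers the converter tops out near 92 W, matching the "a little over 90 watts" in the article and falling well short of the 112 W the camera needs.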
This problem only arises when shooting high-spec immersive content, and so far few people in the industry are even aware of it.
A year ago, this problem didn't exist. At that time, the Blackmagic URSA Cine Immersive camera had just started shipping.
To date, the Apple Vision Pro remains the only device on Earth that can perfectly display this kind of content.
Most of the dozens of teams on site had already used the Apple Vision Pro and shot content for it; some had even delivered projects. They came with one question in mind: how do you create high-spec immersive content?
Yet they face a host of very specific problems, and no one knows the standard answers.
From engineering problems to filmmaking problems
In the past, most developers who came to this center worked in programming, design, or game development. This is the first time content creators have gathered here in such numbers.
The barrier to creating high-spec immersive video is extraordinarily high.
In the beginning there were no dedicated cameras and no editing software that could handle the format. All that existed was a format specification published by Apple.
White paper "Apple Movie Profiles for Spatial and Immersive Media"
Getting from shoot to delivery meant writing code or modifying existing tools at every step. The pitfalls one team stumbled into, the next team had to hit all over again. On early creative teams in this field, it was often engineers, not cinematographers, who held the camera.
In December 2024, Blackmagic launched the first camera specifically designed for immersive videos. However, it wasn't until a year later, when DaVinci Resolve 20.1 was released, that the immersive video workflow was fully supported for the first time.
RAW files can now be imported directly into DaVinci Resolve for editing, color grading, and spatial audio mixing, with metadata preserved throughout. At last, the entire pipeline can run without writing code.
New obstacles followed close behind.
Just 16 minutes of shooting produces more than 1.2 TB of raw footage, which means storage and transfer solutions have to be redesigned from scratch.
Monitoring is even trickier. A 2D monitor cannot show true binocular depth, and alignment errors between the left and right eyes are invisible on a flat screen; by the time they surface in post-production, a reshoot is usually the only option. Some cinematographers call the wide-angle preview almost "misleading": only by reviewing in real time on the headset can you know what was actually captured.
The logic of camera placement is also completely different. There is no zooming in spatial video: a concert shoot might require twenty or thirty fixed-focal-length cameras running simultaneously, with the editor choosing shots in post. When should a cut happen? Where should it cut to? Where is the audience actually looking?
Much of the accumulated experience from traditional video simply doesn't apply here.
The Weeknd's immersive short film "Open Hearts" has only about 30 cuts; a conventional music video for the same song would have 300 to 400.
When viewers can turn their heads and scan the whole space for themselves, rapid-fire hero shots lose their meaning. Part of the creative initiative still rests with the director, but most of it has been handed back to the audience.
What exactly do the audience want to watch?
We contacted an XR experience store in Guangzhou. The owner, Jeffrey, is also a veteran of the XR media scene. He told ifanr that in just over a month the store had become number one in group-buying sales for the area, all thanks to an Apple Vision Pro.
CORTIS is a K-pop boy group that rose to popularity in the second half of 2025. In South Korea's highly industrialized entertainment business, close-range immersive videos of idols are a common marketing tool, and CORTIS chose the Apple Vision Pro, clearly drawn by its clarity and immersion.
CORTIS
On January 30, 2026, "NEAREST: CORTIS", an immersive video of the group's dance practice, launched for free on Apple TV. Its intimate performances and enveloping audio-visual presentation drew rave reviews and spread widely among fans. Yet only a tiny number of people can actually watch it.
Experience stores built around the Apple Vision Pro suddenly had a chance to break out of the niche.
Such stores normally draw customers from within about 5 kilometers. But as fans' recommendation posts spread organically on Xiaohongshu, audiences dreaming of a close encounter with their idols flocked in from everywhere.
Jeffrey said customers find the content so good that they shrug off every shortcoming of the device: the pressure on the face, the heat, the messy hair afterward. They don't mind at all. For years, VR content has struggled to break into the mainstream; the immersive videos on the Apple Vision Pro have broken that deadlock.
Content excerpted from YouTuber "pppppatti"
Immersive video reproduces the true parallax of the real world. The members stand right next to you; when you look at them, it feels as if you are actually in the practice room. Countless three-dimensional details rush toward you, creating an unprecedented sense of realism, so real it feels like a dream.
Apple has a set of standards for evaluating the picture quality of immersive video. The most important criterion is not resolution or pixel density (PPI) but "perceived clarity": put simply, something like an eye chart.
Apple wants every high-quality immersive video on the Apple Vision Pro to reproduce the experience of seeing the world with the naked eye: rich detail in the foreground, a naturally softer background, and frame rate, color, and brightness as close as possible to real vision.
Only then can a convincing sense of immersion be guaranteed.
So when we put on the Apple Vision Pro, we can find ourselves on a cable car ten thousand meters up, at an NBA All-Star game, or in front of a Spanish bullfight. These dreamlike scenes, hard to experience in person, are what audiences really want to see: a genuinely scarce experience.
Excerpted from the Apple official website
Dream-making: a scarce experience in the AI era
IDC estimates that Apple Vision Pro shipments in 2025 came to only about 100,000 units; in the two years since launch, a little over 500,000 have been sold. It may be Apple's most disappointing product since the iPhone.
Too heavy, too expensive, too far ahead of its time: these are the charges leveled against the Apple Vision Pro.
Yet Apple has never stopped investing in content for the Apple Vision Pro.
The first episode "Backcountry Skiing" of "World of Red Bull"
Over the past two years, more than 30 immersive videos have launched on Apple TV, with something new arriving every month or two.
From baseball diamonds to basketball courts, from the dinosaur age to the animal kingdom, from World War II submarines to high-wire acts, from boy-group practice rooms to concert stages... It is hard to imagine Apple steadily producing such expensive content for a device whose shipments have fallen by 90%.
Clearly, Apple has never considered the Apple Vision Pro a failure. Instead, it is betting on the device's future: on a dream-making experience that is bound to grow even scarcer.
Over two years of using the Apple Vision Pro, I have often wondered: is the thing on my head the future iPhone, the future Mac, or the future iPad?
The Apple Vision Pro is a product without trade-offs. Apple doesn't know what users want, so it gives them everything.
Loaded with so many functions, it has to make too many compromises. But at the very least, the dream-making experience that blurs reality and illusion is the one irreplaceable part of the Apple Vision Pro.
Three years ago, the Apple Vision Pro and ChatGPT represented the two most eye-catching directions in tech. Today the former is a geek toy with 500,000 units sold, while the latter has changed how a billion people live.
And as generative AI pushes the barrier to content creation toward zero, we are entering the most saturated era of digital content in history.
At that time, what exactly do the audience want to watch?
In 1984, William Gibson described a new medium - "simstim" - in his novel "Neuromancer."
It involves