Tesla's in-car infotainment system has adopted the Unreal Engine, and its intelligent driving interface could soon rival AAA games. But does it really matter?
While Tesla's FSD/Autopilot races against Chinese intelligent driving solutions, a new front of competition has opened up.
Tesla hacker @greentheonly recently noticed that the 2025.20 software update introduced binary files related to the Unreal Engine. According to the blogger, the Unreal Engine-based visualization can already be activated on Model S and Model X vehicles equipped with AMD chips, and the 3D vehicle model supports interactions such as zooming and swiping. It works much like the 3D model desktops in the smart cockpits of Chinese domestic brands: it allows direct interaction and gives a more intuitive view of the vehicle's status.
@TeslaNewswire added that Tesla will eventually use the Unreal Engine in the in-vehicle system to enhance the visuals of Autopilot and FSD. Combined with the in-car SR display (the real-time rendering of the environment the vehicle perceives around itself), it could present a far more detailed and realistic picture of the surroundings.
(Image from X user @IanSamyth)
The blogger even created a series of concept renders, which look no different from a highway racing game.
Whether Tesla will really use the Unreal Engine to build an in-car SR view rivaling PC game graphics is hard to say at present. In fact, many consider the in-car SR view "dispensable". Carmakers have been using the Unreal Engine to build flashy features for a long time, yet beyond fairly polished 3D model desktops and elaborate interactive visual effects, the gains in practicality have been negligible.
Still, now that Tesla has joined the game, Dianchetong hopes the company can use the Unreal Engine to build genuinely useful features and inspire the industry.
From "Bonus" to "Must-Have": 3D Elements Are Now a Necessity for In-Vehicle Systems
To be precise, Tesla is not the first to bring the Unreal Engine into the cockpit. Game engine-based 3D HMIs had already become mainstream in automotive smart cockpits before Tesla arrived.
In November 2021, Human Horizons (HiPhi) announced a company-level strategic partnership with Epic Games. The results landed in the HiPhi Z launched in 2022: leveraging the Unreal Engine's mature visualization and material system and well-developed toolchain, it delivered the first "true 3D digital cockpit" in China.
Of course, HiPhi is not the only car brand to have set its sights on the Unreal Engine.
The world's first cockpit system built on the Unreal Engine came from Lotus. Called LOTUS HYPER OS, it debuted on the Eletre. It is not just the detailed 3D scenes: the entire UI of the Lotus system is built on the Unreal Engine, with a realistic PBR material system and HDR ambient lighting applied across most of the interactive interfaces. You might mistake it for a manufacturer's CG launch video, but it is in fact a 3D scene rendered in real time by the cockpit system.
(Image from Lotus Cars)
Even the animated avatar of XPeng's voice assistant is rendered as a 3D model.
To meet the performance demands of this cockpit system, Lotus fitted the Eletre with two Qualcomm Snapdragon 8155 chips, a configuration considered lavish at the time. The Geely Galaxy E8 and MG Cyberster have likewise achieved real-time 3D rendering with the Unreal Engine. In HiPhi's words: "Through physically based real-time rendering, it simulates the human process of perception and reconstruction, constructing a freely interactive 3D space."
Mercedes-Benz's MB.OS also received a visual upgrade with the help of the Unity engine. Zoom in on the map and you can see detailed building models, their facades carrying a premium metallic sheen.
According to Mercedes-Benz, designers used a 3D visualization software suite called Unity Industrial Collection to bring elements such as the instrument cluster, infotainment, and navigation into 3D. Smart EV makers such as Xiaomi, along with NIO across its entire lineup, have likewise built their own 3D desktop systems on the Unity engine.
Why do carmakers favor 3D game engines for in-vehicle visual design? Dianchetong sees two reasons.
First, the interactive experience of three-dimensional scenes feels more novel. A game engine brings a sense of technology and more advanced rendering, enabling interaction beyond the 2D plane. Opening an electric tailgate from a 2D menu is merely "functional interaction"; on a 3D desktop it becomes an innovative "spatial interaction", a scene-based experience.
(Image from XPeng Motors)
For SR and navigation interfaces, 3D interaction conveys a sense of "space" that traditional 2D interfaces lack. For example, the data streamed in real time from cameras and lidar can be displayed in the 3D interface.
The 3D smart cockpit has shifted from "bonus" to "must-have". KPMG predicts that China's smart cockpit market will reach 212.7 billion yuan in 2026, with a penetration rate of over 82%, and carmakers are using 3D engine technology to build differentiated experiences.
Second, partnering with engine vendors improves the development efficiency of 3D smart cockpits. Take the Unreal Engine: Epic Games and Qualcomm have jointly optimized the engine's performance on the Snapdragon platform, while Unity has co-developed automotive-grade solutions with Chinese chipmakers such as Xinchi and Xinqing, raising hardware utilization and accelerating deployment.
Overall, carmakers use 3D game engines mainly for in-cockpit entertainment and interactive experiences. Tesla's adoption of the Unreal Engine likely serves the same purpose, though with a slightly different emphasis.
Can the In-Car SR View Rival AAA Games? Cool, but Probably Useless
Among smart EV makers, Tesla pays relatively little attention to 3D visuals, and 3D elements occupy little of its in-car display. Judging from the materials the two bloggers shared, Tesla seems more inclined to use the engine to improve the SR view under its ADAS than to upgrade the interactive interface.
Frankly, Dianchetong believes 3D visuals do little for in-car interaction efficiency. Take the 3D scene desktop: the car model and scene occupy most of the display, yet actual use becomes no more convenient; at best, the human-machine interaction feels a bit warmer. This is, of course, just Dianchetong's subjective view, and carmakers do offer other desktop modes.
(Image from XPeng Motors)
Back to the ADAS SR view: will Tesla's, after adopting the Unreal Engine, look like what the bloggers showed? Dianchetong's answer is no. As a tool, the in-car ADAS SR view does not need such elaborate scene rendering for now; at the very least, the ideal effect is hard to achieve at this stage.
Let's start with how the in-car SR view works. The common industry practice is a monochromatic 3D scene: environmental data from the vehicle's cameras and lidar is run through AI to identify surrounding objects (cars, trucks, pedestrians, e-bikes, and so on), which are then presented in the 3D scene. Dianchetong sees two reasons for keeping the scene monochromatic.
First, the SR view presents the result of the vehicle's recognition of its surroundings; its purpose is to help the driver understand both the environment and what the vehicle has recognized. A monochromatic scene strips away color distractions, leaving only the essential shapes, which makes information transfer more efficient.
(Image taken by Dianchetong)
As some traditional car owners put it, flashy on-screen elements can even interfere with their driving.
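The monochrome approach described above can be sketched in a few lines. This is a purely hypothetical illustration, not any carmaker's actual pipeline: the class names, asset names, and confidence threshold are all invented for the example. The point is that the SR view deliberately discards everything except object category and position.

```python
# Hypothetical sketch of how an SR view might map raw perception output
# onto simplified monochrome scene elements. All names and thresholds
# here are illustrative, not any real carmaker's API.
from dataclasses import dataclass

# Simplified low-poly assets: one neutral mesh per recognized category.
ASSET_FOR_CLASS = {
    "car": "lowpoly_car",
    "truck": "lowpoly_truck",
    "pedestrian": "lowpoly_person",
    "e-bike": "lowpoly_bike",
}

@dataclass
class Detection:
    cls: str          # object category from the perception stack
    x: float          # position relative to the ego vehicle, meters
    y: float
    confidence: float

def build_sr_scene(detections, min_confidence=0.5):
    """Keep only confident detections and strip everything but
    category and position -- no color, no texture, no detail."""
    scene = []
    for d in detections:
        if d.confidence < min_confidence or d.cls not in ASSET_FOR_CLASS:
            continue  # unknown or uncertain objects are simply not drawn
        scene.append({"asset": ASSET_FOR_CLASS[d.cls], "pos": (d.x, d.y)})
    return scene

frame = [
    Detection("car", 12.0, -1.5, 0.95),
    Detection("pedestrian", 6.0, 3.0, 0.88),
    Detection("car", 40.0, 0.0, 0.30),   # too uncertain: dropped
]
scene = build_sr_scene(frame)
print(scene)
```

Only the confident car and pedestrian survive; the low-confidence detection never reaches the screen, which is exactly the "higher information transmission efficiency" the monochrome design is after.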
Second, rendering such a complex 3D scene carries a heavy performance cost. As noted above, Lotus fitted the Eletre with two Snapdragon 8155 chips just to keep LOTUS HYPER OS running smoothly. The more realistic the simulation, the greater the demand on the cockpit chip, so it makes sense that Tesla rolled out the Unreal Engine first on the AMD-chip Model S and Model X.
There is another problem: constrained by lidar, cameras, and processing power, the environmental information fed back by the perception system in real time is prone to errors.
The simplest example: when using ADAS, you may see surrounding vehicles and pedestrians jitter on screen, and occasionally elements appear out of thin air that do not exist in the real world. These are the current limits of real-time perception and real-time rendering, and the problem is more pronounced in Tesla's vision-only models.
Now imagine a highly realistic in-car SR view in which the vehicles and pedestrians around you jitter, or the system shows people in an area that is actually empty. It sounds a little creepy. Even if the cockpit chip could easily handle real-time 3D rendering, the perception output would still need algorithmic filtering and constraints to make the display accurate enough.
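The kind of display-side filtering alluded to here can be sketched simply. This is an illustrative toy, not Tesla's or anyone's actual algorithm; the smoothing factor and confirmation threshold are made up. It shows two standard ideas: exponentially smoothing jittery positions, and refusing to draw an object until it has been seen for several consecutive frames, so a one-frame "ghost" never reaches the screen.

```python
# Hypothetical sketch of display-side filtering for an SR view:
# smooth jittery per-frame positions, suppress flickering objects.
# All constants are illustrative only.
class SmoothedTrack:
    def __init__(self, alpha=0.3, confirm_frames=3):
        self.alpha = alpha                  # smoothing factor in (0, 1)
        self.confirm_frames = confirm_frames
        self.pos = None
        self.seen = 0

    def update(self, measured_pos):
        self.seen += 1
        if self.pos is None:
            self.pos = measured_pos
        else:
            # Exponential moving average damps frame-to-frame jitter.
            self.pos = tuple(
                (1 - self.alpha) * p + self.alpha * m
                for p, m in zip(self.pos, measured_pos)
            )
        return self.pos

    @property
    def display(self):
        # Only draw objects confirmed over several consecutive frames,
        # so a single spurious detection is never rendered.
        return self.seen >= self.confirm_frames

track = SmoothedTrack()
for raw in [(10.0, 0.0), (10.6, -0.2), (9.8, 0.1), (10.2, 0.0)]:
    pos = track.update(raw)
print(track.display)  # True after 4 confirmed frames
```

The raw measurements swing by almost a meter, but the smoothed position barely moves, which is the difference between a twitching SR view and a stable one.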
Besides, the driver can see the real environment through the windshield, so a photorealistic 3D reconstruction is generally unnecessary. The current monochromatic 3D scene is already the ideal form for the in-car SR view at this stage.
Conclusion
Tesla's adoption of the Unreal Engine is not necessarily aimed at a realistic ADAS SR view. In Dianchetong's view, advanced PBR materials and real-time interface blur effects also call for a capable rendering engine.
These are just subjective speculations. Perhaps Tesla really can resolve the tension between perception and the in-car SR view, blur the boundary between the real and digital worlds, deliver a good experience, and set an example for the industry.
From the user's perspective, the practical value of realistic 3D scene rendering remains limited; or rather, the industry has yet to find a must-have scenario for it. While driving, the vehicle must perceive and judge its environment accurately, but the user does not need to follow that through the in-car screen. 3D elements serve mostly as emotional value, a showcase of the brand's strength.
(Image from Tesla)
That does not make 3D game engines a false need; they play a real role in smart vehicles. Through partnerships with chip suppliers, Unity and the Unreal Engine have become deeply involved in graphics optimization for in-vehicle chips, building ecosystem dominance in the smart-vehicle era and becoming a core force in carmakers' intelligent transformation. Even Tesla, which pays little attention to 3D visuals, dabbles in 3D car models and 3D driving scenes.
Further into the future, carmakers will also use autonomous driving simulation platforms built on game engines to rapidly train against and simulate real-world driving scenarios, something Dianchetong has reported on and analyzed before.
Where should physics engines go in smart vehicles? Perhaps, just as smartphone operating systems emphasize human-machine interaction, a physics engine could better simulate the motion of objects, helping carmakers create menu animations with a more organic, "breathing" feel, which may prove more practical than an in-car SR view rivaling AAA games.
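Physics-driven UI motion of the kind described above usually means driving an interface element with a damped spring rather than a fixed-duration animation curve. The sketch below is a generic illustration of that idea, not any cockpit system's code; the stiffness and damping constants are arbitrary examples.

```python
# Hypothetical sketch of physics-based UI motion: a damped spring
# drives a menu panel toward its target, giving the organic
# "breathing" feel the text describes. Constants are illustrative.
def spring_step(pos, vel, target, dt, stiffness=120.0, damping=22.0):
    """One integration step of a damped spring (semi-implicit Euler)."""
    accel = stiffness * (target - pos) - damping * vel
    vel += accel * dt
    pos += vel * dt
    return pos, vel

# Animate a panel sliding from closed (0.0) to fully open (1.0).
pos, vel, target, dt = 0.0, 0.0, 1.0, 1 / 60
for _ in range(120):  # two seconds at 60 fps
    pos, vel = spring_step(pos, vel, target, dt)
print(round(pos, 3))  # settles near 1.0
```

Because the motion comes from simulated forces rather than a timed curve, the panel responds naturally if the target changes mid-animation, which is exactly what gives spring-driven interfaces their lifelike quality.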
In the end, though, how each carmaker uses these engines is up to them.
This article is from the WeChat public account "Dianchetong". Author: Dianchetong. Republished by 36Kr with permission.