Embodied intelligence is surging and lidar orders are booming: leading players achieve 600% annual growth, with shipments exceeding 200,000 units.

Auto Intelligence Reference (智能车参考), 2025-08-09 11:41
The leading players in the lidar industry are creating “eyes” for robots.

A new wave is approaching, and as usual the supply chain moves first.

In the wave of popularization and deployment of intelligent vehicles, lidar has become a must-have configuration, much like a "seat belt".

Design wins, shipment volumes, explorations of the technological frontier... the good news keeps pouring in, and every earnings season has become a regular stage for breaking records.

Now, in this new wave of development, the role of the key supply chain is once again being demonstrated clearly and firmly.

At the just-opened World Robot Conference (WRC), lidar has shown value and functions that go far beyond "safety redundancy".

New scenarios and new applications are also new engines for the iteration of lidar technology.

From the era of automotive robots to the era of embodied intelligent robots, the role and position of lidar remain unchanged.

For the mass production of embodied robots, lidar takes the lead.

Embodied intelligence is in full bloom, and lidar takes a step ahead

Robots are, in effect, a spillover and extension of the software and hardware technologies of autonomous driving and intelligent vehicles.

Whether the task is driving or more complex living and working skills, the front end needs to perceive the environment, and the back end needs a brain that can understand the scene and autonomously plan and control the motion mechanism: a VLA (vision-language-action) multimodal large model.

Therefore, from the underlying computing chips to the base model architecture to the front-end perception devices, autonomous driving and embodied intelligence are highly reusable. This is why most robot startups are built around veteran autonomous-driving teams.

More straightforwardly, an intelligent vehicle is a kind of four - wheeled robot.
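To make this reuse concrete, here is a minimal, purely illustrative sketch of a shared perceive-plan-act loop. The class and function names are hypothetical rather than taken from any vendor's stack; the point is that the loop structure stays the same whether the "body" is a four-wheeled vehicle or a legged robot.

```python
# Illustrative sketch only: a hypothetical shared perceive-plan-act interface.
# Names are invented for illustration and do not refer to any real product API.
from dataclasses import dataclass, field
from typing import Protocol


@dataclass
class Observation:
    point_cloud: list = field(default_factory=list)  # lidar points (x, y, z, ...)
    images: list = field(default_factory=list)        # camera frames


@dataclass
class Action:
    command: str  # steering/throttle for a car, joint targets for a robot


class Policy(Protocol):
    def act(self, obs: Observation) -> Action: ...


class VLAPolicy:
    """Stand-in for a vision-language-action model shared across embodiments."""

    def act(self, obs: Observation) -> Action:
        # A real VLA model would fuse the observation with a language instruction
        # and emit motion commands; this stub just holds position.
        return Action(command="hold")


def control_loop(policy: Policy, get_observation, send_action, steps: int = 3):
    """The same loop serves a car ("go anywhere") and a robot ("do everything")."""
    for _ in range(steps):
        obs = get_observation()  # front end: perception (lidar + cameras)
        act = policy.act(obs)    # back end: understanding and planning
        send_action(act)         # the motion mechanism executes the command


if __name__ == "__main__":
    control_loop(
        VLAPolicy(),
        get_observation=lambda: Observation(),
        send_action=lambda a: print("executing:", a.command),
    )
```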

But at the WRC exhibition, you can still intuitively feel the biggest difference between robots and intelligent vehicles:

An intelligent vehicle's only mission is to "go anywhere"; the challenge for robots is far more arduous: "do everything".

For example, Yu Yinan, formerly the head of Horizon's intelligent driving business, founded a robotics startup called Vita Dynamics, whose first product is an intelligent quadruped robot dog focused on companionship and assistance:

In addition to being able to interact with people naturally through language and actions, it can help you carry heavy loads when you go out to buy groceries, go on an outdoor outing or climb a mountain:

It can also act as a private photographer, freely recording and shooting as it follows you:

Of course, you can also set a "dog-walking mode": while autonomously perceiving and exploring the environment, it still responds to your instructions:

Another example is Xingdong Jiyuan's highly anthropomorphic service robot "Xingdong Q5", which can act as an all-around "shopping buddy":

It easily handles basic tasks such as carrying bags, holding plates, delivering water, moving objects, and pressing elevator buttons, and it copes just as well with passing through narrow passages, fetching items from high places, squatting to polish shoes, and bending down to pick things up.

Unitree Technology, hugely popular this year, unveiled its latest quadruped robot, the Unitree A2, at WRC. It weighs about 37 kg, can walk continuously for 3 hours and 12.5 km fully loaded and for 5 hours and 20 km unloaded, and is lighter, faster, and stronger than the previous generation:

It can be applied in almost all harsh-environment scenarios such as security, inspection, and rescue.

Robots in the broader sense have even more diverse application scenarios. For example, MOVA has launched two lawn-mowing robots, the MOVA 600 and MOVA 1000, which capture complete environmental information in one pass while mapping and can cover a lawn perception area of nearly 2,000 square meters in 0.1 seconds.

They launched overseas in March this year and swept the European lawn mower market, becoming the sales champion in just two months...

The "delivery robot" - JD Logistics vehicle is also equipped with multiple Hesai lidars:

The products at WRC vary in form, function, and deployment scenario, but they all have one thing in common:

They all have real-time, high-precision perception of complex scenes.

This is reflected most directly in the Unitree A2 from Unitree Technology, which has two lidars at the front and two at the back:

The same "equipment" can also be found on robot products such as those of Vita Dynamics, MOVA, and Xingdong Jiyuan:

It is the JT series from Hesai Technology, quite different from the lidar products used in intelligent vehicles and designed specifically for robot scenarios.

Is lidar also a necessity for robots?

With this question in mind, Auto Intelligence Reference had a chat with Si Xinquan, the senior director of Hesai's robot perception business.

He told us that the role and significance of lidar for vehicles and robots are actually different.

For intelligent vehicles, lidar needs a longer detection range: at a given speed, it must identify target obstacles as early as possible, acting as a "seat belt" that guarantees the lower bound of the ADAS system's reliability.

Robots, however, whether quadruped, humanoid, wheeled, or tracked, all work within a bounded space; it is just that the types and states of the target obstacles in that space are far more disordered and complex.

Therefore, positioning, navigation, and obstacle avoidance are a robot's core capabilities. Before lidar, the solution was GPS plus cameras. But vision-based solutions always carry the risk of missed and false detections, and GPS works poorly in environments such as office buildings and shopping malls.

Lidar, which covers 360° without blind spots and accurately depicts the size, shape, and relative position of targets, has therefore become the robot's most important "eyes".
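As a rough illustration of what 360° blind-spot-free coverage gives a robot, the sketch below checks a generic point cloud for obstacles inside a bounded workspace. The point format, the 5-meter workspace radius, and the ground-filter threshold are assumptions made for illustration, not properties of any specific sensor.

```python
# Hypothetical sketch of lidar-based obstacle checking in a robot's workspace.
import math

def nearby_obstacles(points, max_range_m=5.0, min_height_m=0.05):
    """Return (distance, bearing_deg) for points inside the assumed workspace.

    `points` is an iterable of (x, y, z) in the sensor frame, in meters.
    Returns below `min_height_m` are treated as ground and ignored.
    """
    hits = []
    for x, y, z in points:
        if z < min_height_m:            # crude ground filter
            continue
        dist = math.hypot(x, y)
        if dist <= max_range_m:         # full 360° coverage: no angular gating
            bearing = math.degrees(math.atan2(y, x))
            hits.append((dist, bearing))
    return sorted(hits)                 # nearest obstacles first


if __name__ == "__main__":
    cloud = [(1.2, 0.3, 0.4), (4.0, -2.0, 1.1), (0.8, 0.1, 0.01), (9.0, 0.0, 0.5)]
    for dist, bearing in nearby_obstacles(cloud):
        print(f"obstacle at {dist:.2f} m, bearing {bearing:+.1f}°")
```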

This is why the JT series, the lidar line Hesai developed for robots, does not use solid-state or semi-solid-state technology but instead opts for the more mature mechanical lidar.

The JT series scans with a full rotation in the horizontal direction to obtain a 360° horizontal field of view, achieves up to 256 lines vertically, and, through Hesai's innovative optical module design, reaches a 187° vertical field of view.

Its perception coverage is equivalent to about 1.5 standard football fields, and its minimum detection distance is 0 meters, leaving no blind spots in close-range perception.
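The stated figures (a 360° horizontal field of view, a 187° vertical field of view, and a 0 m minimum detection distance) can be turned into a simple coverage check. The sketch below is back-of-envelope only: the maximum range, the placement of the 187° span relative to the horizon, and the football-field dimensions used in the cross-check are assumptions, not published specifications.

```python
# Back-of-envelope sketch of the stated coverage figures; values marked as
# assumptions are illustrative and not taken from any product datasheet.
import math

H_FOV_DEG = 360.0            # stated: full horizontal rotation
V_FOV_DEG = 187.0            # stated: vertical field of view
MIN_RANGE_M = 0.0            # stated: no close-range blind spot
MAX_RANGE_M = 60.0           # assumption for illustration

# Assumption: the vertical span runs from straight down (-90°) to just past
# the zenith (+97°); the actual orientation is not given in the article.
V_LOW_DEG, V_HIGH_DEG = -90.0, V_FOV_DEG - 90.0


def in_coverage(x, y, z):
    """Is a point (meters, sensor frame) inside the assumed coverage volume?"""
    dist = math.sqrt(x * x + y * y + z * z)
    if not (MIN_RANGE_M <= dist <= MAX_RANGE_M):
        return False
    elevation = math.degrees(math.asin(z / dist)) if dist > 0 else 0.0
    return V_LOW_DEG <= elevation <= V_HIGH_DEG   # horizontal is a full 360°


# Cross-check of "about 1.5 football fields": assuming a 105 m x 68 m pitch,
# 1.5 pitches of circular coverage implies a radius of roughly 58 m.
implied_radius = math.sqrt(1.5 * 105 * 68 / math.pi)
print(f"implied coverage radius: {implied_radius:.1f} m")
print(in_coverage(2.0, 1.0, -0.5))   # a point just below and beside the sensor
```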

This breakthrough upgrade not only gives robots comprehensively improved all-round perception, but also covers both ground detection and the structure of the space above, greatly improving a robot's positioning accuracy and obstacle recognition.

The smart reader has surely thought of another solution: why use a "backward" mechanical structure instead of combining multiple solid-state lidars to piece together 360° coverage, much like the phased-array radar arrangement on warships?

Si Xinquan gave two reasons.

First, economics: three solid-state or semi-solid-state units cost far more than one mechanical JT. That may be acceptable for industrial robots, but for consumer-grade robots such a solution is hard to mass-produce at scale.

Second, technology: the "mechanical" in Hesai's JT series is no longer the stereotypical mechanical lidar that costs hundreds of thousands of dollars and is as large as a flower pot:

Behind this is a breakthrough in Hesai's self-developed lidar chip technology.

When competing on the ADAS and L4 tracks in earlier years, Hesai was the first to propose the chip-based route. The AT128, mass-produced in 2022, uses a chip-integrated transceiver architecture that cuts the overall product volume to nearly half that of a traditional solution while doubling point-cloud density.
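To see why chip integration and channel count translate into point-cloud density, a quick back-of-envelope calculation helps: points per second scale with the number of channels times the horizontal samples per frame times the frame rate. The figures below are assumed round numbers for illustration, not specifications of the AT128 or any other product.

```python
# Illustrative arithmetic: point-cloud density scales linearly with channel count.
# Channel counts, resolution, and frame rate are assumed values, not real specs.
def points_per_second(channels, h_fov_deg=120.0, h_res_deg=0.1, frame_rate_hz=10.0):
    """channels x horizontal samples per frame x frames per second."""
    horizontal_samples = h_fov_deg / h_res_deg
    return channels * horizontal_samples * frame_rate_hz


baseline = points_per_second(channels=64)     # assumed discrete-design baseline
chip_based = points_per_second(channels=128)  # assumed chip-integrated design
print(f"64-channel design:  {baseline:,.0f} pts/s")
print(f"128-channel design: {chip_based:,.0f} pts/s  ({chip_based / baseline:.1f}x)")
```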

The JT series uses the latest fourth-generation chips, which integrate the transmitting-end VCSEL array, the receiving-end 3D-stacked SPAD array, and the signal-processing unit at the chip level, achieving a minimalist chip architecture that unifies sensing, storage, and computing.

The JT on the Xingdong Q5 and the Vita robot dog is about the size of a lychee.