
While some humanoid robots are still "looking for a job", Olaf from Disney has already started "working".

盒饭财经 | 2026-03-24 11:55
Olaf stands out vividly.

Do you still remember the Olaf robot on the stage of NVIDIA GTC 2026?

Yes, it's Olaf the snowman from Disney's "Frozen", the IP-themed robot that interacted with Jensen Huang on stage.

Image source: Screenshot of NVIDIA GTC 2026

It's going to start working soon.

Taking small steps and waving its little twig-like arms, Olaf, Disney's hardest-working employee, wobbles in to work. Seeing a group of people standing outside the railing, it greets each of them unprompted. When it spots someone pointing a camera at it, it immediately raises its eyebrows, widens its eyes, spreads its arms, and tilts its head, posing for a photo.

Image source: YouTube screenshot

On major social media platforms in China and abroad, netizens have reported that Olaf has appeared at Disneyland Paris, performing inside a railed-off area every 30 minutes. When meeting visitors, the Olaf robot moves smoothly and naturally throughout, with rich, vivid expressions, as if it had stepped straight out of the screen.

Unexpectedly, Disney has found a new job for the embodied intelligence industry.

According to the official website of Walt Disney Imagineering (WDI), the Olaf robot will make its official debut at Disneyland Paris on March 29, and will later be introduced to Hong Kong Disneyland.

Kyle Laughlin, senior vice president of R&D at Walt Disney Imagineering (WDI), has said that in the past they usually had to wait several years after a movie's release to launch the corresponding offline interactive experience. Now, with the support of reinforcement learning, audiences can see the corresponding character robots soon after watching the movie.

With the Olaf robot, Disney has brought an animated IP character into the real world. It also shows that robots need not only be productivity tools that free up human hands; they can be the characters themselves, carriers and continuations of IP narratives.

The Olaf Robot with a "Sense of Life"

The "sense of life" of the Olaf robot is very prominent.

In addition to being able to walk and make all kinds of funny expressions, the Olaf robot can also talk to people and sometimes even crack humorous jokes.

While walking, it will take the initiative to say hello and introduce itself: "Hi everyone, I’m Olaf."

The classic line from the movie, "I like warm hugs!", has also been brought into reality.

When it can't find its way back, it gets embarrassed and says, "This is embarrassing." But this social butterfly never worries about getting lost; it simply asks people for help: "Which way is north?" Then it wobbles off in the direction the passer-by points.

On the GTC stage, Olaf also interacted with NVIDIA CEO Jensen Huang in real time.

When Jensen Huang introduced Olaf's origin, the robot playfully said, "I was just about to say that." After listening to a lengthy introduction about itself, it blurted out, "I'm a snowman, not a snowclopedia."

This comeback was quite smooth.

Olaf's ability to hold such smooth conversations is thanks to a large library of lines pre-recorded by the staff.

Kyle Laughlin mentioned in an interview with the tech news site Gizmodo that they invited Josh Gad, Olaf's voice actor in "Frozen", to pre-record a large number of lines and jokes. In actual conversations, Olaf is still remotely operated by a human, who selects appropriate lines to play back based on the visitor's side of the conversation.

After its appearance at the GTC conference, Olaf also received a lot of praise online: "I really want to have one!" "This is the kind of robot that should be on the Spring Festival Gala." "I think it's even cuter than the animation."...

Whether in appearance or conversation, the robotic Olaf is indistinguishable from the Olaf of "Frozen", like a living creature escaped from the Kingdom of Arendelle. Netizens have given it a fitting nickname: the Living Olaf.

This "sense of life" comes mainly from two things: an innovative mechatronic design and deep reinforcement learning. The former gives Olaf a "physical body" faithful to its original look; the latter lets Olaf learn to stand and walk on its own in a virtual world, without programmers hand-writing the motion code.

The Disappearing Symmetry: The "Space Cube" Hidden Under the Foam Skirt

The "sense of life" of Olaf can also be intuitively felt in its limbs.

Imagine detaching your right leg and reattaching it backward, knee facing the rear, while leaving your left leg unchanged. How would you walk? In the human world this is absurd: you couldn't walk, or even stand steadily, like that.

However, this behavior that goes against human walking common sense has become one of the keys for Disney to successfully create the "physical body" of the Olaf robot.

Why design the leg structure of the Olaf robot in this way? It's due to its unique body structure.

With the current technological level, it's not difficult to create a "physical body" for a robot. The difficult part is that Olaf is not a "human".

Olaf in "Frozen" is an animated snowman. It has no legs; its body consists of three independent snowballs of different sizes, with a big head, a thin neck, a carrot nose, twig-like arms, and parts such as eyebrows that can be freely detached and reattached.

Image source: D23

From a scientific perspective, in a physical world with gravity and friction, Olaf's structure defies the laws of physics. Three independent snowballs stacked together would topple under gravity without support, especially with such a large head, and a legless snowman can't walk. Moreover, in the movie Olaf's body can be squeezed and stretched at will: depending on the plot, it slides along the ground like a ball or walks like a human.

These designs of Olaf cannot exist as an upright - walking individual in the real world.

So if Disney's R&D team wanted to turn the physics-defying Olaf of the movie into a real upright-walking robot, they could not design it like a humanoid robot.

To make the robot look like the movie's Olaf, Disney's R&D team hid all the mechanical parts and used a polyurethane foam shell to shape the robot's snow-white exterior. They also adopted an innovative mechatronic design for the body hidden under the "foam skirt".

Image source: Disney official website

The legs of a traditional humanoid robot are mirror-symmetric, but the space in Olaf's lower body is too limited to fit two identical drive sets side by side. So Disney's R&D team designed an asymmetric six-degree-of-freedom leg structure in which the hip and knee joints of the two legs point in opposite directions: the left leg's hip actuator faces backward and its knee faces forward, while the right leg is the reverse. The two legs interlock like pieces of a Rubik's Cube.

Image source: YouTube screenshot

With this design, the hip and knee actuators of the two legs are staggered, so the drives never collide at the joints.

The movement of Olaf's arms, eyes, and mouth is driven at a distance through precise linkage systems.

What does this mean?

Robots have no motor nerves. Their legs, torso, arms, eyes, mouth, and other parts are moved by actuators, that is, motors: each movable joint needs a motor behind it. Just as a car needs an engine to run, a robot needs motors to walk and make its various movements.

Image source: Disney Research Imagineering

However, because of Olaf's unusual body structure, motors cannot be mounted directly at some joints. Instead, they are placed in the torso, and linkages transmit the motion to the joints.

For example, there isn't enough room in the Olaf robot's shoulder for an actuator, so Disney's R&D team placed it in the torso, that is, Olaf's belly, and used a spherical five-bar linkage (a closed loop of five links that can rotate freely up-down and left-right) to drive the arm. The motor turns the links, and the links push the whole arm around the shoulder joint, like a puppeteer's fingers moving a puppet.

Image source: Screenshot of NVIDIA GTC 2026

A single actuator drives both jaws of the mouth. The lower jaw is driven directly (a motor connects straight to the chin joint), while the upper jaw is coupled through a four-bar mechanism: a set of four linked bars connects the upper palate to the chin, so that when the chin opens, the linkage automatically pulls the upper palate through a subtle displacement, making it look as if Olaf is opening its mouth to speak.

Notably, the Olaf robot's twig-like arms, carrot nose, buttons, eyebrows, and hair are all attached magnetically, matching the movie gag in which Olaf's arms and nose can be pulled off and stuck back on.

According to the technical paper released by Disney's R&D team, the finished Olaf robot is 88.7 cm tall (excluding hair), the same size as the movie Olaf, weighs 14.9 kg, and has 25 degrees of freedom (25 independently movable "joints" across its body), 12 of them in the two legs. No wonder Olaf walks so briskly and gracefully.

A Million Rebirths in Cyberspace

When Olaf was learning to stand and walk, falling was the norm, as it is for most robots. The difference is that at the beginning it had no "physical body" at all; it learned first in the virtual world.

Like humans, the Olaf robot also has a "brain" and a "cerebellum".

But the Olaf robot's "cerebellum" is an "AI brain". This AI brain is not a large language model like ChatGPT, but a deep reinforcement learning (DRL) policy model. Deep reinforcement learning lets an AI repeatedly try, fail, and collect rewards in a complex virtual environment, learning on its own how to achieve a goal.
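The trial-and-error-with-rewards idea can be sketched with a toy example. Below, a tabular Q-learning agent learns purely from reward to push a tilting "snowman" back upright. This is an illustrative sketch only; Disney's actual system uses deep neural-network policies trained with PPO, not a lookup table, and the states, actions, and rewards here are invented for the example.

```python
import random

random.seed(0)

TILTS = range(-2, 3)    # discretized tilt: -2 (far left) .. +2 (far right)
ACTIONS = [-1, 0, 1]    # push left, do nothing, push right
Q = {(s, a): 0.0 for s in TILTS for a in ACTIONS}  # value table, starts at zero

def step(tilt, action):
    """Apply the push; reward is highest when the body ends upright (tilt 0)."""
    new_tilt = max(-2, min(2, tilt + action))
    reward = 1.0 if new_tilt == 0 else -abs(new_tilt)
    return new_tilt, reward

for _ in range(500):                      # many short trial episodes
    tilt = random.choice(list(TILTS))
    for _ in range(10):
        # epsilon-greedy: mostly exploit what was learned, sometimes explore
        if random.random() < 0.1:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(tilt, a)])
        new_tilt, reward = step(tilt, action)
        best_next = max(Q[(new_tilt, a)] for a in ACTIONS)
        # standard Q-learning update: nudge the estimate toward reward + future value
        Q[(tilt, action)] += 0.1 * (reward + 0.9 * best_next - Q[(tilt, action)])
        tilt = new_tilt

# The learned policy pushes back toward upright from every tilt.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in TILTS}
print(policy)
```

The agent is never told "push opposite to the tilt"; that rule emerges from reward alone, which is the core idea the article describes.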

Before training, Olaf was like a baby whose brain hadn't yet developed: its "AI brain" was just a pile of meaningless random numbers.

What would happen if this "AI brain" were put into Olaf's body at this time?

Since it hadn't yet learned to stand or walk, Olaf would twitch in place as if having a seizure, or simply fall flat on the ground.

To keep the Olaf robot from damaging itself while learning to walk in the real world, Disney's R&D team built its own simulator, called Kamino, on top of the Newton simulation framework, for training robots like Olaf that must walk in complex environments.

In this simulator, Disney engineers spawned thousands of digital clones of Olaf and let them learn to stand and walk.
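Why do thousands of clones help? Each clone can try a different controller, and only the best survive. The toy sketch below evaluates many candidate controllers for a simplified one-dimensional balance task and keeps the winner. The `gain` parameter, the dynamics, and random search itself are all assumptions for illustration; the real Kamino pipeline trains neural-network policies with gradient-based RL, not random search.

```python
import random

random.seed(1)

def rollout(gain, steps=50):
    """Simulate one clone: a tilting body corrected by the rule u = -gain * tilt.
    Returns total reward; upright posture (tilt near 0) scores best."""
    tilt, total = 1.0, 0.0
    for _ in range(steps):
        tilt = (1.0 - gain) * tilt + random.uniform(-0.05, 0.05)  # control + noise
        total += -abs(tilt)
    return total

# Spawn many "digital clones", each testing a different candidate controller.
candidates = [random.uniform(0.0, 2.0) for _ in range(1000)]
scores = {g: rollout(g) for g in candidates}
best_gain = max(scores, key=scores.get)
print(f"best gain ~ {best_gain:.2f}, score = {scores[best_gain]:.2f}")
```

Gains near 1.0 cancel the tilt almost immediately, so the search reliably converges there; in real training the same massively parallel evaluation happens inside the physics simulator on GPU.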

Image source: YouTube

Specifically, how was it done?

Image source: Gemini

First, the engineers fed the AI motions drawn by animators as "perfect-score models", letting it train in the virtual world with the goal of walking exactly like the reference, much as a student aims for full marks on an exam.
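One common way to express "walk like the perfect-score model" is an imitation reward that peaks when the simulated pose matches the animator's reference frame. The function below is a generic sketch of that idea; the exponential form, the `scale` constant, and the function name are assumptions for illustration, not the exact reward terms in Disney's paper.

```python
import math

def imitation_reward(sim_pose, ref_pose, scale=5.0):
    """Reward in (0, 1]: exp of negative squared pose error.
    1.0 means the simulated pose exactly matches the reference frame."""
    err = sum((s - r) ** 2 for s, r in zip(sim_pose, ref_pose))
    return math.exp(-scale * err)

ref = [0.0, 0.3, -0.1]                    # reference joint angles for one frame
print(imitation_reward(ref, ref))          # exact match -> 1.0
print(imitation_reward([0.1, 0.2, 0.0], ref))  # small error -> below 1.0
```

Summed over every frame of the clip, this gives the trainee a dense "exam score" to climb toward, which is what lets the policy reproduce the animator's style rather than just any stable gait.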

After setting the goal, it's time to start learning to walk. In this process, PPO (Proximal Policy Optimization) was used. It