A post-90s former Huawei employee secures nearly 100 million yuan in financing: teaching robots to be cute.
You may wonder: how can a humanoid robot that can't do the dishes or perform labor raise nearly 100 million yuan in financing? But that is exactly what happened recently.
This spring, at the Shanghai AWE exhibition, Niu Tengyu, the founder of Qingxin Yichuang, noticed a detail.
A little girl squatted in front of a small robot wrapped in a furry exterior for forty minutes. When she was finally pulled away by her mother, she kept looking back.
This robot is called Amoo and is from Qingxin Yichuang. It can't do the dishes, serve tea, or carry boxes. What it does is very simple: blink, turn its head, jump around, and respond to people with body movements.
But it's precisely these robots that seem "useless" that make Niu Tengyu more and more certain of one thing: The first stop for embodied intelligence to enter the home may not be labor, but companionship.
In the past two years, most robot companies have been competing on "whether they can do work": whose hands are more dexterous, whose movements are more complex, and who can enter factories, warehouses, and service scenarios.
Qingxin Yichuang has chosen a different path: First, let the robot enter real families and real lives as a "living character."
Recently, Qingxin Yichuang completed nearly 100 million yuan in financing, jointly led by Houxue Capital and Tianji Capital, with existing shareholder Lepu Capital participating. Since its establishment in November 2023, it has raised funds in three rounds.
Pencil News recently interviewed Niu Tengyu, the founder of Qingxin Yichuang. Some highlights are as follows:
1. What is the most suitable use for a robot in the home?
Answer: Not labor, but providing emotional value.
2. What is the first challenge for a companion robot?
Answer: Jerky, unnatural movements.
3. How can a companion robot impress users?
Answer: Not by talking, but by body language.
4. Which products will be quickly eliminated?
Answer: Those that are mediocre, just a shell, or disposable.
After reading this article, you will gain an understanding of which are the opportunities and which are the pitfalls for AI companion robots.
The First Hurdle for Robots to Enter the Home
I'm part of the post-90s generation. I earned my doctorate at the University of Cambridge and previously worked on the core team of Huawei's autonomous driving division.
Many people regard humanoid robots as the next smartphone or new energy vehicle.
That comparison isn't an exaggeration, but it misses a premise: people already wanted mobile phones and cars before those products appeared. What exactly do people need robots for?
Today, most companies are competing on whose hands are more dexterous, whose movements are more complex, and whose control is more precise. This path is correct, but I've always been thinking about another question:
Is there a scenario where robots can be used in the real world without such high precision?
Yes, the answer is emotional interaction.
If a robot is used for work, the margin for error is tiny. When it picks up a cup, for example, a failure means a broken cup. But if it's used to provide emotional value, the worst case is that it's "not that cute," which costs the user nothing.
In this way, the "data" problem that has been plaguing the industry is also solved. Because providing emotional value doesn't require that much data, and it can generate its own data (by entering family life).
Why has autonomous driving been getting better and better? Because the car market is large, and cars running on the road naturally collect a large amount of real-world traffic data.
But robots don't have that much natural data, and not many robots are bought by people and taken home. Data has to be collected specifically, which is extremely costly.
But if a robot can enter a home through "emotional interaction," it's different.
It talks to users, makes expressions, and moves around every day. Over time, it will remember what the user said, what expressions they made, and how they reacted. These "memories" allow it to train itself through real-life interactions and become better at responding to the user's emotions.
In this way, the robot becoming "more understanding and smarter" happens naturally during the interaction.
Recently, more and more people in the industry are starting to mention concepts like "entering the home" and "emotional interaction." We started working on this a year and a half ago (at the end of 2024) when not many people believed in it.
The First Obstacle: Jerky Movements
Of course, there are still some practical problems for robots to enter the home, such as the long-standing problem in the industry: jerky movements.
Human movements are naturally smooth and fluid. At a party, when you're gesturing wildly and suddenly someone calls you, you'll naturally turn your head and stop without having to put your arms back in place first.
But most robots on the market finish one action, pause, and then do the next.
Why? Because the common practice is to assign a separate "switch" to each action. To change actions, you have to turn off the previous one and then turn on the next one.
For example, if your hand is holding a cup, you have to put it down before doing something else.
This "reset-then-continue" method works fine in factories. The robotic arms on a production line don't need to chat with people or read their expressions. But when robots enter homes and interact with people, it's awkward.
When we were doing emotional interaction, this was the first problem we encountered.
We wanted the robot to look "alive": to be happy, to be curious, to turn and look at you. But at first its movements were jerky, like a toy running low on battery.
This problem has been plaguing the industry for a long time.
The ideal approach is to redesign the entire control system: use a "master switch" to manage dozens of actions at the same time, allowing the model to learn a set of actions on its own and learn how to smoothly switch between different actions.
We first tested this concept on Orca. After verifying its feasibility, we iterated to develop an "emotional gait large model" with richer, smoother emotional expression.
With that foundation, plus multimodal signals such as eye contact and voice, Amoo's vivid emotional interaction emerges naturally.
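The difference between "reset-then-continue" switching and blended transitions can be sketched in a few lines. This is a toy illustration under my own assumptions (plain linear interpolation between hypothetical joint-angle poses), not Qingxin Yichuang's actual control system:

```python
# Toy sketch: per-action "switch" control vs. a blended transition.
# Poses are hypothetical joint angles, represented as lists of floats.

def reset_then_switch(current, neutral, target, steps=4):
    """Old-style switching: return to a neutral pose first, then start
    the next action. The robot visibly pauses at the neutral pose."""
    path = []
    for goal in (neutral, target):
        start = path[-1] if path else current
        for i in range(1, steps + 1):
            t = i / steps
            path.append([a + (b - a) * t for a, b in zip(start, goal)])
    return path

def blended_switch(current, target, steps=4):
    """Blended switching: interpolate directly from wherever the limbs
    are toward the new action's pose, with no detour through neutral."""
    return [
        [a + (b - a) * (i / steps) for a, b in zip(current, target)]
        for i in range(1, steps + 1)
    ]

mid_wave = [0.9, 0.4]   # hypothetical joint angles mid-gesture
neutral  = [0.0, 0.0]
look_at  = [0.2, 0.7]   # hypothetical "turn head toward the speaker" pose

print(len(reset_then_switch(mid_wave, neutral, look_at)))  # 8 frames, via neutral
print(len(blended_switch(mid_wave, look_at)))              # 4 frames, direct
```

The blended path is both shorter and free of the telltale freeze at the neutral pose; the "master switch" approach described above generalizes this idea to dozens of actions managed by one learned controller.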
How Can Robots Move People? Body Language
If robots are to enter homes in the future, the sense of being alive is very important, but many people may not realize it.
In mid - March this year, we took the robot "Amoo" to the Shanghai AWE exhibition. A little girl hugged the robot for forty minutes and didn't want to let go. Finally, her mother had to drag her away.
Later, we found that similar scenarios often occur in nursing homes, special education schools, and the homes of young people living alone.
These people don't care whether the robot can sweep the floor or cook. What they care about is: Does it have expressions? Will it jump up and down when it's happy? Will it scratch its head when it doesn't understand what you're doing?
I wondered: Why do people develop feelings for a machine that can't do work?
Later, after reading many psychology books, I found a classic conclusion: When people talk to each other, only 7% of the information is conveyed through language, 38% through tone of voice, and the remaining 55% through body movements.
This is why people can develop deep feelings for cats and dogs.
Applying this conclusion to embodied intelligence reveals a problem: many companies making emotional-interaction robots focus all their effort on "talking," as if a robot that can chat is enough.
But what really touches people is often not language but the silent things: actions, expressions, and body posture.
For example, characters like Doraemon, Paddington Bear, and those in Toy Story touch you because they have a "sense of being alive": they can be happy, sad, curious, and afraid. You can understand them at a glance without them saying a word.
Therefore, when making emotional-interaction robots, the technical focus shouldn't only be the "brain" (making it smarter and better at chatting) but also the "body."
We first align the semantic layer with the physical layer, so that the robot's eye contact, voice, and body movements work together. Then we test it repeatedly in realistic scenarios, making sure it can express emotions through actions and switch states smoothly, until it finally feels "alive" and warm.
To achieve this goal, many things from hardware to algorithms have to be redone.
For example, to give the robot a furry exterior, we had to preserve the agility of its movements without compromising the accuracy of its sensors. There were almost no ready-made solutions to draw on.
What did we do? We contacted six or seven hundred suppliers and spent half a year on the problem. In the end, one team agreed to explore it with us from scratch, and we finally got it right.
Three Types of Products Face Elimination
It's said that the AI companionship industry will undergo a major reshuffle in 2026. I think three types of companies are most at risk:
The first type is the "shell" product: a large model bolted onto simple hardware and rushed to market. It can attract attention with its appearance in the short term, but it has no technology of its own and can easily be copied by Huaqiangbei manufacturers.
The second type is the "mediocre" product: desktop and wheeled robots that all work about the same whichever brand you buy. They're just fast-moving AI hardware, and the novelty wears off quickly.
The third type is the "shelved after a few days" product: users buy it but don't keep using it, so there's no data, no way to iterate and upgrade, and no stickiness.
The companies that survive will have their own technology, a working data cycle, and a real understanding of users. The core is simple: whoever gets enough robots into homes first will have the fuel to train the next-generation models. That is the path we're exploring.
Compared with verifying market demand, implementation is actually the easier part. In China there's no need to worry about hardware being copied; as downstream demand grows, costs fall quickly. Even small joint modules, which have no off-the-shelf options today, can be supplied quickly by domestic vendors once demand is clear.
From the end of this year to the beginning of next year, emotional-interaction robots will start entering homes at scale.
In the short term, we hope the product delivers "high-quality companionship." As scale expands and the supply chain matures, costs will drop significantly, and it will gradually become a mass-market tool.
The robot may not be able to do your dishes yet, but it's learning another thing - how to become a part of your life.
This article does not constitute any investment advice.
This article is from the WeChat official account “Pencil News” (ID: pencilnews), written by Wu Xinxiao, and is published by 36Kr with authorization.