
Two new students of Kaiming He at MIT revealed: for the first time a woman has joined the group, and the other is the inventor of FNO. Both are of Chinese descent.

QbitAI (Quantum Bit), 2025-11-06 15:13
The female student is a top graduate of the ACM Class at Shanghai Jiao Tong University.

The homepage of AI expert Kaiming He has been updated with information about two new disciples.

Both are of Chinese descent with excellent academic records: Ph.D. student Keya Hu and postdoctoral fellow Zongyi Li.

So far, of the six students Kaiming He has recruited since he began teaching at MIT, five are of Chinese descent.

And both of the new Chinese members have stellar track records.

Keya Hu, from Shanghai Jiao Tong University

Keya Hu graduated from Shanghai Jiao Tong University with a bachelor's degree.

For high school, she attended the well-known High School Affiliated to Fujian Normal University.

In 2021, Keya Hu enrolled in the well-known ACM Class at Shanghai Jiao Tong University, majoring in computer science.

According to the official account of Zhiyuan College at Shanghai Jiao Tong University, Keya Hu joined the university's Brain-Computer Interface Laboratory (BCMI) in her junior year, under the guidance of Professor Weilong Zheng.

During that time, she set her research direction on AI for Science: combining AI with brain science and processing raw electroencephalogram (EEG) signals through self-supervised learning, so as to help patients with depression and others troubled by mental health issues.

After months of refinement, she completed a first-author paper titled "Contrastive Self-supervised EEG Representation Learning for Emotion Classification".

This result was accepted by EMBC, a top international conference in biomedical engineering, and she was invited to give an oral presentation in the United States.

Meanwhile, she also contributed as a co-author to a project on improving self-supervised learning; that paper was later accepted at the Cognitive Science 2025 conference.

During the summer vacation of her junior year, Keya Hu went to Cornell University for an internship. Under the guidance of Professor Kevin Ellis and Ph.D. student Hao Tang, she participated in a research project aiming to improve the efficiency of program synthesis and code repair, and she was responsible for the design and implementation of the core algorithm.

The result was accepted at NeurIPS 2024, a top machine learning conference, with Keya Hu as second author.

One project was not enough. Keya Hu then teamed up with Ph.D. student Wen-Ding Li and turned her attention to the then-high-profile open competition for AGI: ARC Prize 2024.

The ARC benchmark was originally proposed by François Chollet to evaluate an AI system's ability to learn and reason when facing new problems, rather than to win through memorization or task-specific training.

ARC Prize 2024 was built around this benchmark, requiring participants to submit algorithms that solve a set of never-before-seen tasks.

In other words, to win, the competing models had to show near-human thinking ability in a few-shot, abstract-reasoning setting.

The criteria were strict, but the temptation was equally great: total prize money exceeded $1 million, attracting 1,430 teams from around the world.

To stand out in this tough battle, Keya Hu led the development of a key method: automatically generating datasets through program synthesis, fine-tuning large language models on that data, and combining it with test-time fine-tuning.

This method proved to significantly improve model performance.
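The data-generation idea can be sketched in miniature. Everything below is illustrative: the tiny "program library", the grid sizes, and the search-based stand-in for the fine-tuned model are assumptions for the sketch, not the team's actual code.

```python
import random

# A tiny "program library": each program maps an input grid to an output grid.
def flip_h(g): return [row[::-1] for row in g]
def flip_v(g): return g[::-1]
def transpose(g): return [list(r) for r in zip(*g)]

PROGRAMS = [flip_h, flip_v, transpose]

def random_grid(h, w, colors=3, rng=random):
    return [[rng.randrange(colors) for _ in range(w)] for _ in range(h)]

def synthesize_task(rng=random):
    """Program-synthesis step: sample a program and apply it to random
    inputs, yielding an ARC-style task (demo pairs + a held-out test pair)."""
    prog = rng.choice(PROGRAMS)
    pairs = [(x, prog(x)) for x in (random_grid(3, 3, rng=rng) for _ in range(4))]
    return pairs[:3], pairs[3]

def solve(demos, test_input):
    """Stand-in for the fine-tuned model: pick a program consistent with
    every demo pair and apply it to the test input."""
    for prog in PROGRAMS:
        if all(prog(x) == y for x, y in demos):
            return prog(test_input)
    return None

# With one unambiguous demo, only flip_h is consistent with the evidence:
demos = [([[1, 2], [3, 4]], [[2, 1], [4, 3]])]
print(solve(demos, [[5, 6], [7, 8]]))  # → [[6, 5], [8, 7]]
```

In the real pipeline, tasks synthesized this way become fine-tuning data for a large language model, and at test time the model is fine-tuned again on each task's few demo pairs before predicting the answer.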

After fierce competition, Keya Hu's team reached state-of-the-art (SOTA) results in the competition and won a "Best Paper Award".

Afterwards, she wrote up this work as a paper, as co-first author, and published it at ICLR 2025, a top machine learning conference.

Don't forget, at this time, Keya Hu was just an undergraduate student.

During her undergraduate years alone, she had already accumulated four high-value papers, two of them as first author.

Such a resume naturally made her a hot commodity when applying for Ph.D. programs: MIT, Princeton, Carnegie Mellon, Cornell, and the University of Washington are all said to have offered her admission.

Finally, she chose MIT for a direct Ph.D. program.

Currently, Keya Hu is a first - year Ph.D. student in the Department of Electrical Engineering and Computer Science at MIT, jointly supervised by Kaiming He and Jacob Andreas.

She focuses on the interdisciplinary research of language and vision at MIT, hoping to create intelligent agents that can use data more efficiently and have stronger generalization ability.

The other new member, inventor of FNO: Zongyi Li

Kaiming He's other new student, Zongyi Li, was already something of a known name in AI academia.

In 2021, while still a Ph.D. student, Zongyi Li published a high-profile first-author paper:

"Fourier Neural Operator for Parametric Partial Differential Equations".

The name may not ring a bell, but it was this paper that proposed the now-famous Fourier Neural Operator (FNO) and, for the first time, realized a truly large-scale application of neural operators.

However, before talking about FNO, we need to figure out a question: What is a neural operator?

To put it simply, a neural operator is a neural network that can learn to "solve physical equations".

During training, scientists feed it large amounts of data tied to physical equations, such as: different initial water temperatures → the time needed to cook instant noodles; different throwing angles and speeds → the final trajectory and landing point of a basketball...

By learning from these samples, the neural operator eventually develops a set of general rules, much like mastering a higher-order multiplication table.

Later, when it encounters similar data, it no longer needs to compute step by step; it can instantly solve the hardest problems by "intuition".

For example:

If you want to predict where a storm will move, a supercomputer using traditional methods has to grind for hours;

But with a neural operator, just input that day's air pressure, temperature, and wind speed, and it can compute the storm's path within milliseconds.

This is like a straight-A student with a photographic memory who, after one glance at the question, writes the answer down directly without working it out on paper.

(Well, if you do this in the college entrance examination, you will still get a score of 0.)

With this "instant calculation" ability, neural operators have amazing effects in fields such as weather forecasting, carbon sequestration, and aerodynamics simulation.

More importantly, it enables AI to achieve generalization at the level of physical laws for the first time, thus becoming a key bridge connecting machine learning and basic sciences.

Zongyi Li's research is precisely in this field, and he took a key step forward: the FNO he and his collaborators proposed moved traditional spatial convolution into the frequency domain, using the Fourier transform to process data, which sped up the neural operator several-fold.
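The core frequency-domain trick can be sketched in a few lines of NumPy. This is a hedged, one-dimensional, single-channel toy; a real FNO layer works on multi-channel tensors, adds a pointwise linear path and a nonlinearity, and learns the complex weights by gradient descent. The function name and toy input here are illustrative.

```python
import numpy as np

def spectral_conv_1d(u, weights, modes):
    """One FNO-style spectral convolution (1-D, single-channel sketch):
    go to the frequency domain, scale the lowest `modes` Fourier
    coefficients by learned complex weights, and transform back."""
    u_hat = np.fft.rfft(u)                      # physical -> frequency domain
    out_hat = np.zeros_like(u_hat)
    out_hat[:modes] = u_hat[:modes] * weights   # learned pointwise multiply
    return np.fft.irfft(out_hat, n=len(u))      # frequency -> physical domain

rng = np.random.default_rng(0)
n, modes = 64, 8
u = np.sin(2 * np.pi * np.arange(n) / n)        # a toy input function
w = rng.standard_normal(modes) + 1j * rng.standard_normal(modes)
v = spectral_conv_1d(u, w, modes)               # same grid, transformed signal
print(v.shape)  # → (64,)
```

Because the multiplication happens on a truncated set of Fourier modes, the layer is resolution-independent and costs only an FFT pair plus a pointwise product, which is where the speedup over spatial convolution comes from.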

It was this breakthrough that made FNO regarded as a milestone model for "AI for Science", and Zongyi Li is recognized as a core contributor to the field of neural operators, with more than 12,000 citations on Google Scholar.

Currently, Zongyi Li is a postdoctoral researcher at MIT, supervised by Professor Kaiming He.

However, his time at MIT may not be long:

It is reported that he has already obtained the position of Assistant Professor at New York University and will start working there in the fall of next year.

Zongyi Li is from Beijing. He studied at The High School Affiliated to Renmin University of China and then went to the United States to study.

During his undergraduate years at Washington University in St. Louis, he double-majored in computer science and mathematics and minored in jazz.

In 2019, he went to the California Institute of Technology for his Ph.D., supervised by Anima Anandkumar and Andrew Stuart; these two advisors jointly witnessed the birth of FNO.

In addition, during his Ph.D. years, Zongyi Li also interned at NVIDIA for three consecutive summers.

What is Kaiming He doing? Digging deep into AI for Science

Actually, back in 2023, Kaiming He said in his job talk at MIT that "AI for Science" would be the direction he focused on in the coming years.

Now, looking at his team lineup, it really makes sense.

The two new members, Keya Hu and Zongyi Li (one did research at Shanghai Jiao Tong University's brain-computer interface laboratory, the other is a leading figure in neural operators), both resonate strongly with this direction.

Adding the earlier Mingyang Deng, Xingjian Bai, Tianhong Li, and Jake Austin, Kaiming He has now gathered six prized students, a genuinely luxurious lineup.

However, Quantum Bit has heard that the original plan this year was actually for seven.

The last candidate is a young man with an equally impressive resume; there are even rumors that his recommendation letter was written, with a strong endorsement, by a well-known professor of the same stature as Kaiming He.

However, he has not appeared in the introduction of Kaiming He's team yet.

As one of the inventors of ResNet, Kaiming He's move from Meta to academia in 2024 has also benefited more outstanding young people: young scholars get to push basic AI research and frontier breakthroughs deeper, free of ROI constraints.

Actually, the members of the original ResNet team are each cultivating the next generation in their own ways.

Kaiming He's main job is as a professor at MIT, but he also has an industry collaboration with Google DeepMind;

Xiangyu Zhang's main job is chief scientist at Jieyue Xingchen (StepFun), and he is also an adjunct professor at his alma mater, Xi'an Jiaotong University;

Shaoqing Ren's main job is vice president of intelligent driving at NIO, and this year he also became a professor at the artificial intelligence laboratory of his alma mater, the University of Science and Technology of China, recruiting students and lecturing...

It is believed that the new generation of AI geniuses are among the people they are mentoring -

Great teachers and outstanding students, the tradition continues.

Reference links:

[1]https://lillian039.github.io/

[2]https://zongyi-li.github.io/assets/pdf/Zongyi_CV_July2025.pdf

[3]https://www.facebook.com/lizongyijohnny/?locale=zh_CN

[4]https://zongyi-li.github.io/

[5]https://mp.weixin.qq.com/s/xTXjyE3MFLHGpVpJEXIrYA

This article is from the WeChat official account "Quantum Bit", author: Jay, published by 36Kr with authorization.