
You can operate a phone with thought alone. MIT's mind-control device makes it possible to issue commands without any mouth or hand movements, with a "mind-reading" accuracy of 92%.

QbitAI, 2025-09-10 10:38
Es ist auch möglich, eine stille Zweierkonversation zu führen.

Can you give commands to your phone just by using your mind without moving your mouth or hands?

Watch the demo video:

Now, even if two people don't speak the same language, they can "talk" through their minds. The words will be directly translated into the other person's language and output through bone-conduction headphones.

A startup team from MIT has launched a non-invasive wearable device that enables humans to "speak" with their minds.

This wearable device allows humans to write, create, and communicate without any physical movements anytime and anywhere. It can even help people with special speech impairments regain their voices.

The R&D team said that they created this wearable device to extend human thinking and allow everyone to easily explore their own world.

The Smart Wearable Learns "Mind Reading"

The name of this wearable device is AlterEgo, which comes from Latin and means "another self".

AlterEgo is a wearable silent voice interaction platform that allows users to interact bidirectionally with computing devices without making any sound or obvious movements.

This means that users can "speak" in their minds as if they were talking to themselves, and the system can understand and process these "silent" inputs with a vocabulary accuracy rate of up to 92%.

It can also provide feedback to users through bone-conduction headphones, transmitting information directly to the user's ears without disturbing the external environment, providing a complete input-output interaction experience.

AlterEgo lets users control various applications through silent speech. For example, a user can silently recite a mathematical formula, and the device will perform the calculation and return the result. Users can also set reminders, arrange schedules, and so on, with the system providing real-time feedback through the bone-conduction headphones.

For instance, you can point at a postcard in your hand, ask for information about the person in the picture, and then ask it to remind you to write a reply letter.

If two people wear AlterEgo at the same time, they can communicate with each other through their minds.

Moreover, AlterEgo features a non-invasive design: no surgical implantation is required. It is easy to wear and connects to other devices via Bluetooth.

So, how does AlterEgo work specifically? The key technology can be traced back to a paper published by the team in 2018.

The principle of the AlterEgo system is based on the capture and analysis of neuromuscular signals.

When humans speak aloud, they activate a series of facial and neck muscles. These same muscles also produce tiny movements during internal vocalization (subvocalization, such as silently reciting words to oneself).

These movements produce electrical activity in the muscles, generating neuromuscular signals. AlterEgo identifies the user's intent by capturing these weak electromyographic signals, which require no audible sound or visible facial movement from the user.

The captured neuromuscular signals are time-varying potential differences. These signals go through a series of processing steps to extract useful information:

  • Denoising: The original muscle signals may contain noise or false signals caused by environmental interference. The system uses filtering algorithms to remove unnecessary interference and ensure the clarity of the signals.
  • Feature extraction: After the signals are filtered, the system extracts features from the signals to obtain feature vectors that can represent speech.
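The two preprocessing steps above can be sketched as follows. This is a minimal illustration, not AlterEgo's actual pipeline: a moving-average filter stands in for the system's denoising stage, and windowed RMS energy plus mean absolute value stand in for its feature extraction (both are classic surface-EMG features); all parameter values here are assumptions chosen for the demo.

```python
import numpy as np

def denoise(signal, kernel=5):
    """Smooth a raw EMG trace with a simple moving-average filter.

    Real systems use band-pass filters tuned to EMG frequency bands;
    a moving average stands in for that stage here.
    """
    window = np.ones(kernel) / kernel
    return np.convolve(signal, window, mode="same")

def extract_features(signal, win=50, hop=25):
    """Slice the denoised signal into overlapping windows and compute
    per-window RMS energy and mean absolute value, yielding one
    feature vector per window."""
    feats = []
    for start in range(0, len(signal) - win + 1, hop):
        w = signal[start:start + win]
        feats.append([np.sqrt(np.mean(w ** 2)), np.mean(np.abs(w))])
    return np.array(feats)

# Synthetic stand-in for a captured neuromuscular trace.
rng = np.random.default_rng(0)
raw = np.sin(np.linspace(0, 20 * np.pi, 500)) + 0.3 * rng.standard_normal(500)

clean = denoise(raw)
features = extract_features(clean)
print(features.shape)  # (19, 2): 19 analysis windows, 2 features each
```

Each row of `features` summarizes one short time window; a sequence of such vectors is what the classifier in the next step consumes.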

The extracted features are fed into a convolutional neural network (CNN) for classification. Once the system recognizes the silent speech, it computes and generates the corresponding output.
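To make the classification step concrete, here is a toy forward pass in the shape of the network the text describes: a 1-D convolution over the signal, ReLU, global average pooling, and a softmax over a small word vocabulary. The vocabulary, layer sizes, and (untrained, random) weights are all assumptions for illustration; the published AlterEgo model is trained on real neuromuscular data.

```python
import numpy as np

rng = np.random.default_rng(42)

VOCAB = ["yes", "no", "add", "call"]  # illustrative silent-speech vocabulary

# Random, untrained parameters; a real system learns these from data.
conv_w = rng.standard_normal((8, 1, 5)) * 0.1   # 8 filters, kernel width 5
dense_w = rng.standard_normal((8, len(VOCAB))) * 0.1

def conv1d(x, w):
    """Valid 1-D convolution: (c_in, T) with (c_out, c_in, k) -> (c_out, T-k+1)."""
    c_out, c_in, k = w.shape
    t_out = x.shape[1] - k + 1
    out = np.zeros((c_out, t_out))
    for o in range(c_out):
        for t in range(t_out):
            out[o, t] = np.sum(w[o] * x[:, t:t + k])
    return out

def classify(signal):
    """Forward pass: conv -> ReLU -> global average pool -> softmax."""
    x = signal[np.newaxis, :]             # (1, T) single-channel input
    h = np.maximum(conv1d(x, conv_w), 0)  # ReLU feature maps
    pooled = h.mean(axis=1)               # global average pool -> (8,)
    logits = pooled @ dense_w             # project to vocabulary scores
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return VOCAB[int(np.argmax(probs))], probs

word, probs = classify(rng.standard_normal(100))
print(word, np.round(probs, 3))
```

With trained weights, the argmax over the softmax output would be the recognized silent word; here the random weights only demonstrate the flow of the computation.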

Built by an MIT Startup Team

The AlterEgo project originated in the MIT laboratory in 2018.

Arnav Kapur is one of the main founders of the AlterEgo system. Before developing AlterEgo, he conducted technical research in multiple fields, including biomedicine, artificial intelligence, and human-computer interaction.

His research goal is to enhance the natural interaction between humans and computers through wearable devices, especially to achieve silent voice recognition through the collection of neuromuscular signals.

AlterEgo is one of his doctoral research projects at MIT.

Utkarsh Sarawgi is another important contributor to the AlterEgo project. He is also a doctoral student at the MIT Media Lab. His research interests mainly focus on fields such as human-computer interaction, wearable devices, and embedded systems.

He developed the AlterEgo system together with Kapur and played a key role in the project, especially in the hardware design and signal processing of the system.

Eric Wadkins is also a core member of the AlterEgo project. He once served as a research assistant at the MIT Media Lab.

In the AlterEgo project, like Sarawgi, Wadkins is mainly responsible for signal processing and hardware implementation.

He participated in the development of the device for capturing neuromuscular signals and optimized the system's performance and wearability.

In addition, Wadkins also participated in a collaborative study with multiple sclerosis (MS) patients to explore the application of the AlterEgo system in helping patients with speech impairments communicate silently.

Professor Pattie Maes of the MIT Media Lab is the faculty advisor of the AlterEgo project. Her research spans a range of cutting-edge areas, including human-computer interaction, augmented reality, artificial intelligence, and computer-aided design.

At the beginning of this year, the project was spun off into a for-profit company to continue advancing the commercial application of the technology.

Reference Links:

[1]https://x.com/alterego_io/status/1965113585299849535?s=46

[2]https://www.media.mit.edu/projects/alterego/overview/

This article is from the WeChat official account "QbitAI". Author: Keleixi. Republished by 36Kr with permission.