
Pinching your fingers will be the main form of future interaction.

ifanr · 2025-08-11 08:00
The next generation of intelligent interaction is already sitting on your wrist.

Have you ever used the "Double Tap" feature on Apple Watch?

Tap your index finger and thumb together twice in the air, and you can control the Apple Watch on your wrist to answer calls or switch songs. It sounds a bit sci-fi and rather abstract; at first, I thought it was just a feature for Apple to show off its technological prowess.

It wasn't until I wore an Apple Watch S9 and, while busy cooking, tried the gesture to answer a call that I became a loyal fan of this interaction. Double-tapping is now muscle memory for my left hand.

After I put on another Apple product, the Vision Pro, I made an even more striking discovery: Apple's ambition has never been limited to the small screen on the wrist.

Source: MacRumors

Is the Apple Watch an instruction manual for XR?

If you hand an iPad to a child who has never touched a smart device, you'll find they can use it comfortably in less than half an hour.

The reason is simple: the touchscreen is an extremely intuitive way of presenting and interacting. Basically, you just tap whatever lights up.

This is also why generative AI based on large language models is so popular today: natural language is another interaction method with an extremely low barrier. As long as you can speak, even imperfectly, you can control the machine effectively. For complex tasks, speaking is sometimes more efficient than traditional touch.

XR mixed reality, a highly anticipated next-generation interaction, is in a completely different situation. Users need to learn a brand-new set of in-air gestures, and these gestures are relatively abstract and don't always map directly onto interface operations.

However, when I put on the Vision Pro, I felt an unusual sense of familiarity: the pinch of index finger and thumb to select content is almost identical to the gesture on the Apple Watch.

This is not a forced comparison. In fact, the settings for the Apple Watch's Double Tap feature include a switch labeled "Ignore when using Apple Vision Pro" to avoid accidental triggers when both devices are in use.

In addition to the Apple Watch, Apple has introduced somatosensory (motion-sensing) interaction in more and more products in recent years.

For example, the head-shake and nod gestures on AirPods and eye tracking on the iPhone would actually be very well suited to smart glasses.

Conversely, these features may themselves be the result of technology trickling down from the Vision Pro: eye tracking on the iPhone, for instance, appears mainly as an accessibility feature.

Either way, more and more technologies and features from the Vision Pro are filtering down to mainstream Apple devices, subtly shaping users' habits and lowering the learning cost of future spatial computing devices.

Moreover, using a watch or wristband to navigate XR devices is not a new idea, and Apple's peers have been researching it too.

The other XR giant, Meta, has been researching how to use a wristband to control computers, phones, and even AR glasses. When Meta unveiled Orion, its future-oriented glasses project, last year, it also demonstrated a smart wristband designed to work with it.

Gesture control is not Meta's ultimate goal, however. The company published a paper in the journal Nature exploring the possibility of a wristband that "reads minds": wearing it, users don't need to perform any physical action at all. Merely "thinking" about an action is enough for the wristband to pick it up and carry out the corresponding operation.

Source: The New York Times

It sounds a bit sci-fi, but the underlying technology is actually long established: electromyography (EMG), which picks up the electrical signals of the motor neurons driving the forearm muscles. The technique has been around for decades, and one of its earlier killer applications was helping amputees control prosthetic limbs.

When the forearm muscles are active, motor neurons in the spinal cord fire electrical signals. Because these neurons connect directly to individual muscle fibers, the signals are strong enough to be read right through the skin. They also travel faster than the actual muscle movement, which lets Meta's wristband even predict what the user is about to do.
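To make that concrete, here is a minimal signal-processing sketch in Python of the general idea; this is not Meta's actual pipeline, and the sampling rate, thresholds, and function names are all illustrative. It rectifies a raw surface-EMG stream, smooths it into an envelope, and flags activation the moment the envelope rises above the resting baseline.

```python
import numpy as np

def emg_envelope(signal, fs, window_s=0.05):
    """Rectify the raw EMG and smooth it with a moving average (a simple envelope)."""
    rectified = np.abs(signal - signal.mean())   # remove DC offset, then rectify
    win = max(1, int(window_s * fs))
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")

def detect_onset(envelope, baseline, k=3.0):
    """Return the first sample where the envelope exceeds k x the resting baseline."""
    above = np.nonzero(envelope > k * baseline)[0]
    return int(above[0]) if above.size else None

# Toy demo: quiet baseline, then a burst of "muscle activity" starting at 1.0 s.
fs = 1000.0                                      # 1 kHz sampling, typical for surface EMG
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)
signal = 0.01 * rng.standard_normal(t.size)      # resting noise
signal[t >= 1.0] += 0.2 * rng.standard_normal((t >= 1.0).sum())  # activation burst

env = emg_envelope(signal, fs)
baseline = env[: int(0.5 * fs)].mean()           # estimate rest level from the first 0.5 s
onset = detect_onset(env, baseline)
print(f"activation detected at t = {onset / fs:.3f} s")
```

Because the neural signal precedes the mechanical movement, a detector in this spirit can fire before the finger visibly moves, which is the basis of the "prediction" described above.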

Almost simultaneously with the wristband reveal, news also emerged of a smartwatch under internal development at Meta. Like Meta's glasses, it has a camera, but it's unclear whether it includes EMG capabilities.

The technology behind the Apple Watch's gesture recognition is slightly different from EMG. It uses the accelerometer, gyroscope, and optical heart-rate sensor to detect tiny finger movements and changes in blood flow, capturing the user's actual input intent.
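As a rough illustration of that kind of sensor fusion, here is a toy heuristic in Python; it is emphatically not Apple's algorithm, and every threshold and signal shape below is invented. It requires two sharp accelerometer spikes in quick succession, each confirmed by a simultaneous dip in the optical blood-flow (PPG) signal, before reporting a double tap.

```python
import numpy as np

def detect_double_tap(accel_mag, ppg, fs, accel_thresh=1.5, max_gap_s=0.5):
    """Toy heuristic: two sharp accelerometer spikes in quick succession,
    each coinciding with a dip in the optical blood-flow (PPG) signal."""
    ppg_floor = ppg.mean() - 2 * ppg.std()       # "blood-flow dip" level
    taps = []
    for i in range(1, len(accel_mag) - 1):
        is_peak = (accel_mag[i] > accel_thresh
                   and accel_mag[i] >= accel_mag[i - 1]
                   and accel_mag[i] >= accel_mag[i + 1])
        if is_peak:
            lo, hi = max(0, i - int(0.05 * fs)), i + int(0.05 * fs)
            if ppg[lo:hi].min() < ppg_floor:     # motion confirmed by blood flow
                taps.append(i)
    # a double tap = two confirmed taps no more than max_gap_s apart
    return any(0 < (b - a) / fs <= max_gap_s for a, b in zip(taps, taps[1:]))

# Synthetic demo: two tap impulses 0.3 s apart, each with a matching PPG dip.
fs = 100.0
n = int(2.0 * fs)
accel = np.full(n, 1.0)          # ~1 g at rest
ppg = np.full(n, 1.0)
for t_tap in (1.0, 1.3):
    i = int(t_tap * fs)
    accel[i] = 3.0               # sharp impulse from the finger tap
    ppg[i] = 0.2                 # brief blood-flow dip under the sensor
print(detect_double_tap(accel, ppg, fs))  # True
```

Requiring both signals to agree is what keeps a heuristic like this from firing on ordinary arm movement, which jolts the accelerometer but doesn't change the blood flow under the sensor in the same way.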

In January this year, an Apple patent exploring EMG technology was also disclosed; it integrates the sensors into a compact Apple Watch. The patent further demonstrates how such an EMG watch could work in conjunction with headsets, earphones, glasses, tablets, and phones.

Compared with Meta, Apple has a significant advantage:

Users may not be willing to wear an unattractive wristband whose only job is to operate AR glasses, but they are very happy to wear an Apple Watch.

Even the form of the Apple Watch itself is paving the way for the popularization of smart glasses.

In an ifanr interview with Apple executive Kurt Knight, he emphasized the Apple Watch's nature as a miniature "computer".

From this perspective, we have long been accustomed to wearing a "computer" on our bodies; the next step is smart glasses.

Practical and Feasible Split-Type AR

Meta Orion, the current template for AR glasses, consists of three components:

  • Glasses worn on the head, serving as the display interface
  • A wristband worn on the wrist for somatosensory interaction
  • The "brain" responsible for computing functions - a main unit about the size of a TV remote control

Compared with the all-in-one Vision Pro, by letting each component take on part of the work, Meta Orion achieves AR glasses that are genuinely lighter and comfortable enough to wear for long stretches.

If we map these three components, by form or function, onto Apple's ecosystem, the Apple Watch can serve as the wristband for somatosensory input, and the iPhone is already a computing center with its own operating system and plenty of performance.

In this sense, Apple is only a pair of display glasses short of a product like Meta Orion.

In fact, the market has no shortage of AR display glasses. Companies like Thunderbird (RayNeo) and XREAL have been working in this field for years and have already launched some excellent products.

XREAL One Pro

XREAL has also collaborated with Google, demonstrating Project Aura this year: a pair of AR glasses that pair with a separate computing unit, setting a reference benchmark for the Android XR camp.

This means a real pair of AR glasses is not as far away as we thought; in fact, the technology is almost ready.

According to Bloomberg, after the Vision Pro's lukewarm reception, Apple is indeed exploring a pair of display-focused AR smart glasses that must be paired with an iPhone or Mac. They are expected to launch next year.

There is also a piece of news that seems unrelated at first glance but carries deeper implications when considered alongside this:

Apple is reportedly preparing a "desktop mode" for the iPhone: connect an external display over USB-C, and it presents a Stage Manager-like interface well suited to presentations and viewing.

Samsung DeX mode. Source: The Verge

The mode itself is not new, and existing implementations on the market have been met with tepid responses. So when news broke that the iPhone was testing such a feature, it was genuinely puzzling.

However, think about it from a different angle: what if this mode is meant not for external displays but for AR glasses?

A colleague at ifanr who owns a Samsung phone tried pairing it with a pair of AR glasses and said the experience was quite good, like carrying a portable computer.

On the Apple side, users might use the iPhone directly as a trackpad for input, or unlock the full XR gesture experience through the Apple Watch. The glasses could also carry cameras for eye-tracking input.

ifanr has previously argued that the iPhone, and smartphones in general, have evolved into composite devices and will not be replaced by next-generation computing terminals such as smart glasses for a long time.

Therefore, until the technology and form of smart glasses mature enough for them to work as independent personal computing devices, the more practical and accessible solution is to keep the iPhone as the computing core, the glasses as a dedicated AR display, and the Apple Watch as the gesture input.

This approach also lets Apple gain a foothold in the XR market on the strength of the iPhone's and Apple Watch's enormous user base.

Moreover, this solution is not exclusive to Apple; the Android XR camp could replicate or even surpass it. There are rumors that future versions of Android will ship with the Android XR system built in and support interaction with Wear OS watches.

Source: How-To Geek

Qualcomm, which has focused on the XR platform in recent years, recently launched the new Snapdragon AR1+ Gen 1 chip. Beyond stronger performance and a smaller footprint, Qualcomm indicated that the chip will support smart rings as an input method for the Android XR system.