The biggest problem with the Type-C interface is that it seems to have been "standardized".
Some time ago, I bought a pair of lithium-ion AA rechargeable batteries. These batteries are lighter and have a higher voltage than traditional nickel-metal hydride rechargeable batteries, and they come with a built-in Type-C charging port.
Lithium-ion rechargeable batteries with Type-C
I thought I had finally gotten rid of that bulky nickel-metal hydride battery charger. But after the batteries arrived, I found I couldn't charge them properly, so I contacted after-sales support. The customer service reply left me speechless:
Please use the included A to C cable for charging. This product does not support C to C cable charging.
I stared at that shoddy black A to C cable and fell into deep thought: Why aren't Type-C interfaces universal?
Type-C doesn't equal Type-C
While searching for a solution, I came across a video. It mentioned that a newly released C to C adapter could solve the problem of Type-C devices not supporting C to C cable charging. The thing has a strange name, the "5.1K resistor adapter", and it sold out as soon as it hit the shelves; the official video was flooded with comments asking for a restock.
I had only heard of adapters like Lightning to Type-C and micro USB to Type-C before; I never expected to see a Type-C to Type-C adapter in the same form factor. So while waiting to grab one, I discussed Type-C standardization, charging, and data transmission with other netizens, and only then did I understand the root cause of why my rechargeable batteries wouldn't charge. In one sentence: the device doesn't fit the identification resistor required by the USB design specification, so the charger can't tell whether it should supply power, and it never powers the device.
This situation is quite common in some small household appliances, such as handheld fans, portable table lamps, and small flashlights. They all have Type-C interfaces, but they can only be powered by A to C cables.
So why don't manufacturers design according to the standard? What should the official specifications be? Here's a brief introduction to the Type-C design specifications.
Introduction to Type-C specifications
The Type-C interface is rich in functionality: it supports high-power charging and discharging, carries audio and video signals, and can be plugged in either way up... Precisely because it does so much, its physical structure is correspondingly more complex.
Type-C pin definitions
A complete Type-C interface has a total of 24 pins, with the A side and B side being mirror-symmetrical. According to their functions, they can be simply divided into four parts: power supply, data transmission, control, and auxiliary.
Power supply
VBUS: A4, A9, B4, B9 → Power delivery; 5V by default, up to 48V depending on the negotiated protocol
GND: A1, A12, B1, B12 → Ground, completing the circuit and providing a stable reference
Data transmission
Low-speed channel: D+ / D- (A6, A7, B6, B7) → Basic data communication for USB 2.0 (480 Mbps)
High-speed channel: TX / RX (A2, A3, B2, B3, A10, A11, B10, B11) → Used for high-speed data communication such as USB 3 / USB 4 / Thunderbolt
Control (the most critical)
CC: A5, B5
Determine the insertion direction
Determine the power supply direction (who supplies power)
Negotiate current and voltage
Activate fast charging / video mode
Auxiliary
SBU: A8, B8 → Used for audio or video auxiliary signals (such as DisplayPort)
As mentioned above, the missing identification resistor is the 5.1 kΩ pull-down resistor (Rd) on the CC pin. Without it, the device cannot be recognized as a "power sink", and the charger never starts supplying power. This 5.1 kΩ value is exactly the standard Rd value stipulated by USB-IF.
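The detection logic can be sketched as a voltage divider: the charger pulls CC up through a resistor Rp, the device pulls it down through Rd, and the resulting CC voltage tells the charger what is attached. The sketch below is illustrative only; the exact thresholds and the 56 kΩ default Rp come from the USB Type-C specification, and real sources often use current sources instead of pull-up resistors.

```python
# Sketch: how a Type-C source decides whether to enable VBUS, based on
# the voltage it sees on the CC pin. Threshold values are illustrative.

RD = 5_100      # sink's pull-down resistor, 5.1 kOhm (the "missing" resistor)
RA = 1_000      # powered-cable / accessory pull-down, roughly 1 kOhm
V_PULLUP = 5.0  # source-side pull-up supply

def cc_voltage(rp, pulldown):
    """CC voltage from the Rp / pull-down divider (None = no resistor fitted,
    so CC simply floats up to the pull-up voltage)."""
    if pulldown is None:
        return V_PULLUP
    return V_PULLUP * pulldown / (rp + pulldown)

def source_decision(rp, pulldown):
    v = cc_voltage(rp, pulldown)
    if v > 2.6:           # CC floats near the pull-up: nothing attached
        return "no device: keep VBUS off"
    if v < 0.15:          # very low: Ra (cable/accessory), not a sink
        return "Ra detected: keep VBUS off"
    return "Rd detected: sink attached, enable VBUS"

RP_DEFAULT = 56_000  # Rp value a source uses to advertise default USB power
print(source_decision(RP_DEFAULT, RD))    # proper device with 5.1k Rd
print(source_decision(RP_DEFAULT, None))  # cost-reduced device with no Rd
```

With Rd fitted, CC sits at about 0.42V and the charger turns VBUS on; with no resistor at all, CC floats high and the charger concludes nothing is attached, which is exactly the "won't charge over C to C" symptom.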
However, the problems with Type-C are not as simple as just lacking a "pull-down resistor".
A unified appearance, a fragmented heart
Type-C is indeed a very good interface form, but it's still a long way from achieving the USB-IF's vision of "achieving universal, simple, and unified device connection and interoperability".
Scaled-down interfaces
In practice, it's rare to use all 24 pins. Most manufacturers trim the design to fit the actual use case: a typical small household appliance drops everything related to data transmission and keeps only the charging part. Six pins are enough, and as cost control goes this is perfectly reasonable.
In fact, many of these devices used to have micro USB ports. Since the USB-A end on the charger side is the power source by definition, there was never any need to negotiate the power direction the way Type-C does, so the device circuit contains no identification resistors at all. When swapping in a Type-C connector, some manufacturers left the original circuit untouched to save costs, which is why these devices can't be charged with a C to C cable.
In other words, these ports just wear a Type-C shell; inside, they're still the familiar micro USB.
A Type-C female socket with only 4 pins
For example, the Type-C female socket in the picture above has only 4 pins: D+ / D- for USB 2.0 data transmission and VBUS / GND for power. There are no CC pins at all, so devices using this socket cannot be charged with a C to C cable.
There are also sockets that do have CC pins but where the manufacturer didn't solder on the 5.1 kΩ identification resistor. Some handy netizens have soldered the resistor in themselves to make the device support C to C charging.
Manually soldered identification resistor
Different supported power levels
Even looking at charging alone, identical-looking C to C cables can charge at wildly different speeds. Take my own situation: my power bank can trigger my Xiaomi phone's 90W fast charging with the original C to C cable, but some other cables top out at 20W. If you don't know this, the expensive high-wattage charger you bought may be quietly running at low power the whole time.
To reach 60W (20V × 3A), the cable must be rated for at least 3A; to go beyond 60W, you need a 5A cable, which the USB specification requires to carry an e-marker chip.
A cable supporting 6A current
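The arithmetic behind the cable rating is just P = V × I over the standard PD fixed voltage levels. A minimal sketch (standard PD fixed voltages only; PD 3.1 EPR adds 28/36/48V, and proprietary protocols go further still):

```python
# Why the cable's current rating caps charging power: P = V x I.
# These are the standard USB PD fixed voltage levels; cables rated
# above 3 A must contain an e-marker chip per the USB spec.

PD_VOLTAGES = [5, 9, 15, 20]  # volts

def max_power(cable_amps):
    """Highest power a PD charger can negotiate over this cable, in watts."""
    return max(v * cable_amps for v in PD_VOLTAGES)

print(max_power(3))  # plain C-to-C cable: 20 V x 3 A = 60 W
print(max_power(5))  # e-marked 5 A cable: 20 V x 5 A = 100 W
```

So two cables that look identical can differ by 40W of charging power purely because of the conductor rating and the presence of an e-marker.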
Expensive
Now many monitors support the one-cable connection function. You only need a C to C cable to connect your computer and the monitor, and it can transmit video and charge your laptop, making your desktop neater.
But anyone who regularly uses this setup knows that "one-cable connection" doesn't work with just any C to C cable: you need a Thunderbolt 3 (or higher) cable, or a full-function USB-C cable. These cost several times, sometimes more than ten times, as much as ordinary C to C cables.
Original iPhone cable, 6A cable, full-function USB-C cable
Proliferation of proprietary charging protocols
You could say the problems above stem from differing hardware specifications, which is ultimately a cost issue. But many domestic phone manufacturers have gone further and developed their own proprietary charging protocols, which is a problem at the protocol layer.
Domestic phone manufacturers have been competing on charging power since as early as 2014, pushing from 60W to 90W and even past 100W. The official PD charging specification of the time couldn't meet their needs, so they began developing proprietary fast-charging protocols of their own; well-known examples include OPPO's VOOC, Huawei's SuperCharge, and Xiaomi's HyperCharge. By modifying the protocol, these manufacturers really did achieve very fast charging, far ahead of international brands like Apple and Samsung.
But a proprietary protocol only reaches full speed with a dedicated charger, a dedicated cable, and the brand's own phone. Switch brands, or carry phones from several brands, and the charger and cable are no longer compatible; you fall back to 18W or even less. On some high-power chargers, these protocols can also conflict with the standard PD protocol, causing failed negotiations, reduced power, or even repeated handshake loops.
So in essence, proprietary protocols erect new "ecosystem walls" on top of Type-C's "unified interface".
Confusing official naming
On top of all the cutting and tweaking by device manufacturers that makes hardware specs and protocols inconsistent, USB-IF's repeated renaming of the specifications has made things even harder for users to follow:
In 2008, USB-IF launched the USB 3.0 standard.
In 2013, USB 3.1 was released. The original USB 3.0 was renamed USB 3.1 Gen 1, and USB 3.1 was called USB 3.1 Gen 2.
In 2017, USB-IF renamed things again with the USB 3.2 standard: USB 3.1 Gen 1 became USB 3.2 Gen 1, USB 3.1 Gen 2 became USB 3.2 Gen 2, and a new USB 3.2 Gen 2x2 (20 Gbps) tier was added.
...
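The renaming history above boils down to the same link speeds wearing different names over the years, which a small lookup makes plain:

```python
# USB 3.x naming history: one link speed, up to three names over the years.
NAMES = {
    "5 Gbps":  ["USB 3.0 (2008)", "USB 3.1 Gen 1 (2013)", "USB 3.2 Gen 1 (2017)"],
    "10 Gbps": ["USB 3.1 Gen 2 (2013)", "USB 3.2 Gen 2 (2017)"],
    "20 Gbps": ["USB 3.2 Gen 2x2 (2017)"],
}

for speed, names in NAMES.items():
    print(f"{speed}: " + " -> ".join(names))
```

The same 5 Gbps link has carried three official names, which is exactly the kind of churn a buyer standing in front of a shelf of cables has no hope of untangling.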
It's already hard to tell Type-C data cables apart by appearance, and USB-IF's repeated renaming has made everything even more confusing for users. Some netizens made the following picture to mock the situation:
Past and present
Observant readers may have noticed that we were talking about Type-C earlier, but now it's USB 3. This is actually a confusion between the connector shape and the protocol.
Type-C refers to the physical connector, while USB 3 is a specific transfer protocol. The latest USB protocols all use the Type-C connector and are the most widely deployed, which is why the two concepts so often get mixed up.
Interface and protocol
Finally
A few days later, my "5.1K C to C adapter" arrived. By supplying the missing identification resistor, this little thing lets the charger recognize the connected device as a power sink and start delivering power.
My problem is solved, but what about Type-C's? It seems to have quite a few: inconsistent implementations, inconsistent protocol standards, inconsistent user experience... But those are just the surface. The real problem is that Type-C uses a unified connector shape to paper over the complex, fragmented implementations and protocols behind it.
Its problem has never been a "lack of unity", but rather that it only seems unified.