
Breaking News | YJIC Releases HBM3/3e High-Bandwidth Memory Chips, with Bandwidth Reaching the 800GB/s Level

Qiao Yujie · 2026-04-14 09:20
Breaking through the monopoly held by overseas giants in this field.

Author | Qiao Yujie

Editor | Yuan Silai

Recently, Shenzhen Vision Intelligence Storage Technology Co., Ltd. released its HBM3/3e high-bandwidth memory chips in two capacity options, 12GB and 24GB, with bandwidth of up to 819GB/s, compliant with the JEDEC international standard.

Product specifications of Vision Intelligence Storage's HBM3/3e (Image source/Enterprise)

HBM (High Bandwidth Memory) is regarded as one of the key components of today's AI computing infrastructure. As large models continue to scale and training and inference workloads grow heavier, the bandwidth bottleneck between compute chips and memory has become increasingly prominent, a problem commonly referred to in the industry as the "memory wall".

HBM achieves higher bandwidth and capacity per unit area than traditional DDR by stacking multiple DRAM dies and interconnecting them with TSVs (Through-Silicon Vias), and has become one of the mainstream memory configurations for AI accelerators and high-performance computing systems. According to Yole's forecast, the global HBM market will exceed $46 billion in 2026 and is expected to approach $100 billion by 2030, a compound annual growth rate of about 33%.

The global HBM market has long been dominated by three manufacturers, SK Hynix, Samsung, and Micron, which together hold more than 95% of the market share. In recent years, the technological strength of China's HBM-related industrial chain has improved significantly, and many of the bottlenecks widely recognized by the market are gradually being overcome. Against this backdrop, Vision Intelligence Storage's product launch can be regarded as an important milestone for domestic manufacturers in the HBM field.

The newly released HBM3/3e products feature a 1024-bit data bus, an order-of-magnitude improvement over the 64-bit channel of DDR5. The products have two key differentiating advantages: first, by optimizing the core circuit voltage-domain design, overall power consumption is reduced by 20%; second, with a redundant, repairable TSV layout, chip manufacturing yield is increased by about 8%, saving nearly one-tenth of wafer cost at the same production capacity.
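The quoted 819GB/s figure follows directly from the interface width and per-pin data rate. A minimal sketch of the arithmetic, assuming the standard JEDEC HBM3 per-pin rate of 6.4 Gb/s (the article does not state the pin rate itself):

```python
def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak per-stack bandwidth in GB/s: width (bits) x per-pin rate (Gb/s) / 8 bits per byte."""
    return bus_width_bits * pin_rate_gbps / 8

# 1024-bit HBM3 interface at 6.4 Gb/s per pin
hbm3 = peak_bandwidth_gbs(1024, 6.4)
print(f"HBM3 stack: {hbm3} GB/s")   # 819.2 GB/s, matching the figure quoted above

# 64-bit DDR5 channel at 6400 MT/s, for comparison
ddr5 = peak_bandwidth_gbs(64, 6.4)
print(f"DDR5 channel: {ddr5} GB/s")  # 51.2 GB/s, a 16x gap from the wider bus alone
```

The 16x bandwidth gap at the same per-pin rate comes entirely from the wider bus, which is what the die stacking and TSV interconnects make physically practical.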

Advantages of Vision Intelligence Storage's HBM3/3e products (Image source/Enterprise)

In terms of the team, Vision Intelligence Storage was founded in 2023 and focuses on high-bandwidth memory (HBM) chips. The founding team began participating in R&D on earlier generations of HBM technology around 2016, making it one of the earliest engineering teams in the field. The company has established multiple R&D centers in China, and its team includes engineers from memory manufacturers such as Micron and Elpida, giving it an international background.

In terms of business model, the company adopts a Fabless approach of "chip design + wafer foundry + packaging and testing", with the entire supply chain supported by Chinese suppliers. Notably, domestic memory chip design companies still focus mainly on traditional DRAM, and there are relatively few participants in the high-end HBM field. Against this backdrop, Vision Intelligence Storage's HBM product design covers the complete chain from DRAM die to base die, and the company fully owns the intellectual property for both the logic and storage portions, giving it relatively strong core R&D and design capabilities.

On the application side, the company's current HBM3/3e products can already serve training and inference scenarios across the AI industry. As demand for AI computing power spills over into new scenarios, HBM is also beginning to reach into areas such as in-vehicle computing and edge devices. Vision Intelligence Storage has started customized development for scenarios such as automotive, mobile wearables, embodied intelligence, and unmanned devices, and plans to gradually launch diversified products featuring low power consumption, high reliability, and small capacity but high bandwidth.

From the perspective of technological evolution, HBM is still advancing rapidly. The product roadmap disclosed by the company shows that in 2027, customized HBM and an HBM + HBF fusion architecture will be launched, providing TB-level capacity solutions for large-model inference; in 2028, HBM4/4e is planned, raising single-chip bandwidth to 2.5TB/s; and in 2029, HBM5 and in-memory computing products are planned, exploring the architectural direction of "computing close to data".

Product planning roadmap of Vision Intelligence Storage (Image source/Enterprise)