A Ten-Thousand-Word Essay: Artificial Intelligence Won't Make You Rich
Change is real and yet entirely predictable.
Entrepreneurs and investors can amass wealth only when revolutionary technologies give rise to waves of innovative enterprises with investment value. Whether it was the railroad, the Bessemer steel-making process, electricity, the internal combustion engine, or the microprocessor, each technology was like a stray spark in a fireworks factory: it ignited decades of subsequent innovation, permeated every corner of society, and catapulted a new cohort of inventors and investors to the pinnacles of power, influence, and wealth.
However, some technological innovations, despite having a transformative impact on society, struggle to generate new wealth and instead reinforce the status quo. Fifteen years before the advent of the microprocessor, another revolutionary technology—container shipping—emerged at an "inopportune time": technological progress then resembled the Red Queen's race in *Through the Looking-Glass*, with inventors and investors running as fast as they could yet ending up no better off.
Anyone investing in the "new and novel" must answer two questions: first, how much value can this innovation create? Second, who will capture that value? The value of the Information and Communication Technology (ICT) revolution was captured by startups, minting thousands of newly wealthy founders, employees, and investors. In contrast, the value of the container shipping revolution was so widely dispersed that, in the end, only one founder got rich—briefly—and only one investor made a modest profit.
Is generative AI more like the former or the latter? Will it become the cornerstone of wealth for many future industries, or will it produce an overall net loss for the investment community, with only a few winners of local zero-sum games?
There are profitable ways to invest in what AI has achieved, but only if one accepts that, for inventors and investors, the present moment is still "inopportune." AI model builders and application companies will compete their way into an oligopoly, and the benefits AI brings will flow to customers rather than to the technology's builders. The large amounts of capital pouring into AI are therefore flowing in the wrong direction. Apart from a few lucky early investors, those who profit will be those with the foresight to get out early.
The Microprocessor: From “Calculator Accessory” to a Revolutionary Era
In 1971, engineers at Intel invented the microprocessor, but they did not initially grasp how revolutionary the technology was—at the time, they simply wanted to avoid designing desktop calculator chipsets from scratch every time. Outsiders, however, discovered that microprocessors could be used to build personal computers, and hobbyists were the first to put this into practice. Thousands of "geeks" explored configurations and uses Intel had never envisioned. As economist Carlota Perez put it, this distributed, permissionless innovation triggered a "great surge of development"—initiated by technology but driven by economic and social forces[1].
In the early 1970s, there was no real demand for personal computers; they were little more than expensive toys. But early explorers laid the technological foundation and built communities. Around 1975, a step-change decline in the cost of microprocessors made the personal computer market viable:
- The Intel 8080 initially listed at $360 (about $2,300 today). Even at the bulk price of roughly $75 per chip (about $490 today) that manufacturer MITS negotiated, its Altair computer was barely profitable.
- When MOS Technology launched the 6502 at $25 (about $150 today), Steve Wozniak could afford to build the Apple prototype.
- The 6502 and the similarly priced Zilog Z80 forced Intel to cut prices. Entrepreneurs began to emerge from the nascent personal computer community, and dozens of companies were founded, each shipping slightly different products.
In the mid-1970s, no one could have predicted that personal computers (and kindred products such as ATMs, POS terminals, and smartphones) would change the world. At the time, Steve Jobs told investors that "every family will have a personal computer in the future" (which turned out to be a serious underestimate), while others questioned whether personal computers were needed at all. As late as 1979, Apple's advertisements still did not explain what a personal computer could do; instead, they asked users what they would use it for[2]. Established computer manufacturers such as IBM, HP, and Digital Equipment Corporation (DEC) had no interest in products that customers did not need—no one "needed" a computer, so personal computers had to be sold rather than bought.
Glamorous startups like Apple and Sinclair attracted attention through hype; consumer-electronics incumbents such as Atari, Commodore, and Tandy/RadioShack used their strong retail channels to reach potential customers.
Commonalities of Technological Waves: Time, Patience, and "Self-Reinforcing Cycles"
The personal computer market grew slowly at first and accelerated only after "practical applications" such as the spreadsheet appeared in 1979. As use cases multiplied, seeing the technology in real-world use reduced market uncertainty, which encouraged still more enterprises to adopt it—a self-reinforcing cycle. Every technological wave needs time to build momentum: it took nearly 30 years for electricity to reach half of American households, and roughly as long for personal computers to become widespread[3].
When a technological revolution is about to change everything, it demands enormous amounts of innovation, investment, narrative, time, and plain hard work—and it absorbs all available capital and talent. Much like Kuhnian paradigms in science, any technology outside the wave's "techno-economic paradigm" becomes a mere sideshow[4].
The emerging personal computer market attracted venture capitalists, who began to place high-risk bets on new companies. This trend in turn inspired more inventors, entrepreneurs, and researchers, which attracted still more speculative capital.
IBM, the industry's giant before the rise of the personal computer, responded weakly: it did not believe personal computers would last long enough to penetrate its core market, nor did it bother with a small emerging market that was looking for cheaper solutions.
In hindsight we regard the personal computer pioneers as prophets, but at the time almost no one outside a handful of early adopters paid attention to the field. Mainstream media such as *The New York Times* did not take the technology seriously until IBM launched its personal computer in August 1981. In all of 1976, the year Apple was founded, *The New York Times* mentioned "personal computers" only four times[5]. Evidently, only "the crazy ones, the misfits, the rebels, the troublemakers" were paying attention.
Then and Now: The “Surprise Gap” between AI and Personal Computers
When comparing the early days of the computer revolution with the present, the most significant difference is the “element of surprise”: in the 1970s, no one cared about personal computers; in 2025, AI has become a topic of national discussion.
Large companies dislike surprises, and uncertainty is the best moat a startup can have. Had IBM entered the personal computer market in 1979, Apple might not have survived; it was able to keep competing thanks to the $100 million raised in its 1980 IPO (initial public offering). After the industry shakeout triggered by IBM's entry, Apple was one of the few competitors left standing[6].
As the technology took hold and demonstrated its potential, innovations in software and in peripherals such as memory, floppy disk drives, and modems followed, reinforcing one another: every improvement put pressure on adjacent technologies, and whenever one part of the system lagged, investors rushed capital into it. For example, better personal computer memory created demand for more complex software, which in turn required more external storage—prompting venture capitalist Dave Marquardt to invest in disk drive maker Seagate in 1980. When Seagate went public in 1981, it returned roughly 40 times Marquardt's investment. Other investors followed, and about $270 million flowed into the industry over the next three years[7].
Capital also flowed into underlying infrastructure (fiber-optic networks, chip manufacturing, and the like) to ensure that capacity would never become a bottleneck. Companies that used the new technology to leapfrog incumbents began seizing market share, and even conservative competitors realized that failing to adopt it meant extinction. The hype gradually inflated into an investment bubble—the Internet frenzy of the late 1990s.
The ICT wave resembled earlier technological waves (such as the investment booms that followed canal building in the 1830s and railway building in the 1920s): at each stage, predictable human reactions gave rise to the next.
After the Internet bubble burst, society turned against the industry's excessive speculation, and the government gained public support to re-impose controls on technology companies and investors, putting the brakes on the mania. Enterprises stopped innovating blindly as they had during the bubble and turned to proven markets; financiers shifted from speculation to investment; entrepreneurs focused less on underlying technological innovation and more on finding applications. Technology kept progressing, but the changes were incremental rather than revolutionary.
Once the pace of change slowed, enterprises had the confidence to make long-term investments and began integrating the parts of the system in new ways to create value for a much wider range of users. Infrastructure overbuilt during the bubble, such as fiber-optic networks, supplied abundant cheap capacity and lowered the cost of expansion—this was a golden era for entrepreneurs and investors.
The “Uniqueness” of AI: A Revolution without “Room for Trial and Error”
Unlike with the Internet, society began criticizing AI without waiting for the bubble to burst. The backlash against technology has now lasted a decade and we have grown used to it, but the opposition AI faces stands in sharp contrast to the early days of ICT, when technology leaders such as Bill Gates, Steve Jobs, and Jeff Bezos were widely admired.
The world dislikes change. Technology was tolerated in the 1980s and 1990s because people believed that if it went wrong, it could be reversed—which gave early computer innovators room for trial and error. Today everyone knows computers are indispensable, and AI, regarded as part of the ICT revolution, is granted no such wait-and-see grace.
Economist Perez divides each technological wave into four predictable stages: the irruption stage, the frenzy stage, the synergy stage, and the maturity stage, each with unique investment characteristics.
For investors, the middle two stages (the frenzy stage and the synergy stage) are relatively easy to handle:
- Frenzy stage: Everyone rushes in, and investors profit by taking high risks on unproven ideas; the stage ends with the bubble bursting and paper profits evaporating.
- Synergy stage: Rationality returns, enterprises refine their products for a broader range of users, and investors with patience, the ability to pick winners, and the ability to provide non-financial value are rewarded.
Investment in the irruption stage and the maturity stage is more difficult.
Investment in the Irruption Stage: Harder Than It Seems in Hindsight
Investment in the 1970s was far more challenging than it appears in retrospect. From 1971 to 1975, to invest in anything related to personal computers, one had to be either a staunch believer or a conglomerate pursuing indiscriminate diversification:
- Intel was an excellent investment, but in its early days it simply looked like a company from the previous wave of the electronics industry.
- MOS Technology was founded in 1969 to compete with Texas Instruments. Later, to stay in business, it had to sell most of its equity to Allen-Bradley.
- The investor in Zilog in 1975 was actually Exxon.
- Although Apple had great potential, it failed every standard screening criterion venture capitalists used—at the time, the personal computer was still a "solution looking for a problem."
It was not until the early 1980s (late in the irruption stage) that high-quality investment opportunities appeared in volume: personal computer makers (Compaq, Dell), software and operating system companies (Microsoft, Electronic Arts, Adobe), peripheral makers (Seagate), workstation firms (Sun Microsystems), computer retailers (Businessland), and so on. Backing the winners produced substantial returns, but with more capital than ideas, this was not a golden age for investors. By 1983, more than 70 companies were competing in disk drives alone, and valuations fell sharply.
Some people did build wealth in the 1970s and 1980s, and many venture capitalists made their names in this period. But the greatest advantage investors gained in the irruption stage was accumulated institutional knowledge, which laid the groundwork for high-quality investments early in the frenzy and synergy stages that followed.
Investment in the Maturity Stage: The Dilemma of “No Change”
Investment in the maturity stage is harder still: in the irruption stage the future is hard to predict, while in the maturity stage almost nothing changes. Uncertainty about what works and about how customers and society will react all but disappears; everything, and everyone's behavior, becomes predictable.
The lackluster market environment lets the companies that succeeded in the synergy stage consolidate their positions (think of the "Nifty Fifty" or tech giants like FAANG), but growth becomes difficult. They invade one another's markets, pursue mergers and acquisitions, raise prices and cut costs; the era of attracting new customers with low prices ends, and product quality declines. Large companies still claim to pursue revolutionary innovation, but they are more inclined to control how innovation is applied; R&D spending shifts from product and process innovation to increasingly futile attempts to extend the current paradigm—enterprises package this as a battle for victory, but in fact