Cathie Wood's latest assessment: AI is resonating deeply with four other cutting-edge technologies, triggering an unprecedented "great acceleration."
Author | Chang Yuan
Editor | Key Point Jun
On March 13, Cathie Wood, founder of ARK Invest and known on Wall Street as the "Goddess of Stocks," led her core research team through an in-depth interpretation of the "Big Ideas 2026" report, sharing the firm's latest views on future technology trends.
In this session, ARK Invest focused on the convergence of five major innovation platforms: AI, multiomics, public blockchains, robotics, and driverless taxis. The team believes the resonance of these five platforms will trigger an unprecedented great acceleration. A firm believer in disruptive innovation, Cathie Wood argues that we are in the midst of a full-blown technological revolution whose seeds, sown in the 1980s and 1990s, are now taking root.
We have distilled the core points of this podcast. The key takeaways:
1. AI is evolving from simple text conversations into intelligent agents with long-term execution capabilities
The new generation of computing-platform shift driven by AI is moving the human-computer interface from keyboards and touchscreens to natural language. More importantly, AI models are passing a fundamental capability inflection point: no longer limited to short, five-minute tasks requiring frequent intervention, they can now reliably and independently execute complex tasks lasting more than 55 minutes.
This jump in productivity promises enterprises very high returns on investment. As models increasingly generate synthetic data for self-iteration, AI not only cuts enterprise costs but also creates net new revenue. Driven by this strong commercial demand, AI software spending in the trillions of dollars will underpin a massive wave of computing-power and data-center infrastructure construction.
2. The integration of AI and biology is forming a powerful flywheel, ushering healthcare back into a golden age
Although autonomous driving may have the greatest revenue-generating potential, the multiomics and genomics revolution is AI's most profound and disruptive application. The integration of AI and biology is forming a powerful flywheel: massive data trains better models, and better models in turn yield more accurate molecular diagnostics and targeted drugs.
The underlying driver of this flywheel is the precipitous decline in sequencing costs. Human whole-genome sequencing has fallen from nearly $3 billion initially to $100 today, and is expected to reach just $10 by 2030. Biology is becoming the largest data-generating engine on Earth. With AI's support, new-drug development times will shorten by 40%. This paradigm shift will allow one-time curative gene therapies to replace long-term chronic-disease management, and a single drug's economic value may be as much as 20 times that of a traditional drug.
3. Reusable rockets shatter cost limits and open a new era of space infrastructure
According to Wright's Law, every time the cumulative mass delivered to orbit doubles, rocket launch costs fall by 17%. Thanks to the partial reusability of the Falcon 9, SpaceX has already cut launch costs by about 95%.
Once Starship-class heavy-lift rockets achieve full reusability, the cost of sending payloads to space will plummet from roughly $1,000 per kilogram today to under $100. A breakthrough of this magnitude will not only greatly expand satellite communication networks but also make space-based data centers commercially viable. Past a certain cost threshold, AI computing in space may even be 25% cheaper than on Earth, where it is constrained by land and power grids.
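The Wright's Law relationship cited here, a 17% cost decline per doubling of cumulative orbital mass, is easy to sketch numerically. A minimal Python illustration with hypothetical starting values (the function name and inputs are assumptions for demonstration, not ARK's model):

```python
import math

def launch_cost(initial_cost, initial_mass, cumulative_mass, decline=0.17):
    """Wright's Law: unit cost falls by a fixed fraction ('decline')
    with every doubling of cumulative mass delivered to orbit."""
    doublings = math.log2(cumulative_mass / initial_mass)
    return initial_cost * (1 - decline) ** doublings

# One doubling of cumulative mass cuts cost by 17%:
print(round(launch_cost(1000.0, 1.0, 2.0), 1))       # 830.0
# After ~16 doublings, cost falls by roughly 95%:
print(round(launch_cost(1000.0, 1.0, 2.0 ** 16), 1))  # ~50.7
```

Under these assumed parameters, the ~95% cost reduction the report attributes to Falcon 9 corresponds to on the order of sixteen doublings of cumulative launched mass.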
4. Driverless taxis are the first large-scale commercialization of embodied intelligence, reshaping the trillion-dollar transportation market
The public often assumes autonomous driving is merely a driver-assistance feature, but in ARK's view it is the first large-scale, commercialized form of embodied intelligence that consumers encounter in daily life. Winning this transportation revolution hinges on the per-mile operating cost of the underlying vehicles.
At scale, driverless taxi operations are expected to cut transportation costs to 25 cents per mile, less than one-tenth the cost of human-driven ride-hailing in Europe and the United States. Such low prices will unlock enormous latent transportation demand. By the end of this century, driverless taxis are expected to generate an astonishing total addressable market (TAM) of up to $10 trillion in revenue. In this new ecosystem, platform operators holding core autonomous-driving technology will capture most of the economic value.
5. The integration and resonance of the five major innovation platforms will disrupt the global economic paradigm
Amid current concerns and polarized debate over possible over-investment in technology, ARK compares today's wave of AI infrastructure construction to the railway boom of the late 19th century, when railroads accounted for 75% of US stock market value.
AI, multiomics, public blockchains, robotics, and autonomous driving: these five innovation platforms have not only reached critical inflection points but are also reshaping the economic paradigm. For example, widespread autonomous driving would convert up to $4 trillion of uncounted human driving time in the United States each year into real GDP growth. This is not a simple industrial upgrade but a wholesale rebuild of the global economy's underlying engine.
At the end of the video, Cathie Wood emphasized that although disruptive technologies may stoke early anxiety about unemployment, history shows that the AI era will ultimately be one of net job creation. With natural-language programming dramatically lowering technical barriers, we will witness an unprecedented explosion of individual entrepreneurship. In the future, AI will be not just computing power and code but infrastructure that empowers human civilization, letting innovation across all industries land in the physical world at unprecedented speed.
The following is a transcript of ARK's video podcast content:
1. Five major innovation platforms lay the foundation for economic growth
Cathie Wood: Hello, everyone. I'm Cathie Wood, CEO and Chief Investment Officer of ARK Invest. Today I'm here with our research and portfolio team to present the highlights of our 104-page "Big Ideas 2026" research report. We believe this is the kind of research investment banks did after the advent of the PC in the 1980s and 1990s, aiming to glimpse the future of technology. The seeds of that technological revolution were sown and germinated then, and we are now in the midst of a full-scale technological revolution. We need original research to explore the future and understand what it will bring.
I'm honored to share these results with you today. I'd also like to introduce two new members of our team, who form our multiomics research group, a field also known in Europe as genomics. Although autonomous driving may be the largest in revenue-generating terms, we believe the genomics, or multiomics, revolution will be AI's most profound application.
Host: Thank you, Cathie. Good afternoon, everyone. I'm excited to share "Big Ideas 2026" and answer a selection of the more than 800 questions you've submitted. To start, from a macro perspective, how do the various technologies we've observed integrate to produce the so-called "Great Acceleration"?
Researcher: Five major innovation platforms are currently entering the market, with AI as the core driving force accelerating all the others. The other four are multiomics, public blockchains, robotics, and energy storage and autonomous transportation, including driverless taxis (robotaxis). All five platforms are at critical turning points, triggering the largest technology-infrastructure investment cycle since the railway era. This affects macroeconomic growth in the short term, while investment in data centers and AI agents is reshaping the business landscape and laying the foundation for sustained economic growth.
As these investments yield positive returns, we expect the global economy to achieve real compound growth above 7% by the end of this decade. The consensus expectation is around 3%, but our view is fully consistent with the economic-history pattern that the potential equilibrium growth rate shifts during major technological transformations. Markets will follow the macro trend, and we expect more than 60% of total global stock market value to accrue to disruptive innovation platforms. The core message is this: you must embrace innovation. If the assets you hold are not centered on innovation, their share of value may decline over the next decade. Just as 75% of US stock market value in the late 1870s was attributable to railroads, these five platforms will see a similar explosion of enterprise value. This is what we call the "Great Acceleration."
2. The wave of AI infrastructure construction
Host: Okay. The first specific question: are we over-building AI infrastructure relative to available energy, and what impact will that have? Also, how feasible and economical are space-based data centers, and what technological and commercial milestones must they reach to compete with ground-based alternatives?
Researcher: People worry not only about energy availability but also about whether pouring so much capital into AI is wise. Our yardstick for AI performance is how much value it brings to knowledge workers. Today, fully adopting AI can turn one unit of work into 1.5 units of output. Enterprises pay only a small fraction of an employee's salary to get remarkable business returns.
As AI progresses, our baseline scenario has AI software investment reaching $7 trillion, enough to support more than $1 trillion of data-center infrastructure. Energy is a real constraint in some regions (such as central Ohio), but many new cloud companies are working to solve it, and we don't see it as an absolute global limit. Unlike the 1990s, when vast fiber-optic capacity was laid and then sat idle for years, today every GPU is fully utilized, and even in short supply, across text and language models, multiomics, autonomous driving, and embodied robots.

As for space-based data centers: they are not economically viable on existing launch platforms such as the Falcon 9. But with SpaceX's next-generation reusable rocket, Starship, the cost per ton to orbit may drop to a few hundred dollars. Once that threshold is crossed, space-based AI computing could become more cost-competitive than ground-based computing. That would sidestep the political resistance and grid constraints that slow ground data-center construction, and let AI computing scale without local limits. Elon Musk has said this is just an engineering problem, and just as he defied the odds by building cars with mobile-phone batteries, when he focuses on solving an engineering problem he has repeatedly been proven right.
Host: Excellent. Next, could you briefly give your overall view of current AI trends? On revenue generation: in which fields does AI create real net new revenue, rather than merely compressing margins through efficiency gains? And how does ARK distinguish genuinely valuable signals from hype?
Researcher: We see AI as a generational platform shift, similar to the leap from PCs to smartphones. AI is moving the user interface from keyboards to natural language, letting users interact with computers in a friendlier, more powerful way. We will see new product forms with built-in AI assistants, such as Meta's Ray-Ban smart glasses. Compared with the Internet and smartphones, AI is being adopted more than twice as fast, reaching 20% penetration in just three years. Driving this process is the steep decline in the cost of AI model training and inference.
In the consumer field, personal AI agents are becoming the primary entry point to Internet services and information, and users increasingly trust ChatGPT or Claude. This creates new monetization and business models: AI agents can transact on our behalf, and the shift in attention will draw substantial advertising money to these new assistants. For example, integrating Instacart into ChatGPT lets a user photograph a recipe and have the AI complete 90% of the grocery order. This convenient, habit-breaking experience creates incremental revenue that didn't exist before.
In enterprise knowledge work, there has been a fundamental turning point in models' long-horizon reasoning since the end of last year. The average duration over which an AI agent can reliably complete tasks without constant human supervision has risen from 5 minutes to more than 55. This greatly increases enterprises' willingness to pay, because the monthly subscription fee for a basic enterprise chatbot is recouped if an employee saves less than one day of work. As for separating hype from signal: revenues at cloud providers such as AWS, Azure, and GCP are all accelerating, with GCP growing 48% year over year, proof that real demand for computing power is creating substantial new revenue.
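The payback arithmetic behind that willingness to pay can be made concrete. A back-of-envelope sketch in Python; every figure below is an illustrative assumption, not a number from ARK or any vendor:

```python
# Break-even check for an enterprise AI chatbot seat.
# All numbers are assumed, illustrative figures.
monthly_subscription = 30.0       # assumed $/seat/month
annual_salary = 75_000.0          # assumed fully loaded knowledge-worker salary
workdays_per_year = 250

daily_labor_cost = annual_salary / workdays_per_year          # $300/day
breakeven_days_per_month = monthly_subscription / daily_labor_cost

print(f"{breakeven_days_per_month:.2f} days/month to break even")  # 0.10
```

Under these assumptions the seat pays for itself if it saves about a tenth of one workday per month, which is consistent with the "less than one day's work" claim above.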
And it is not only the technology enablers; the end beneficiaries are seeing revenue growth too. For example, Palantir helps insurers like AIG use AI agents to evaluate and underwrite hundreds of thousands of contracts that had been backlogged for lack of manual review capacity. Using AI to fill labor gaps across the economy not only cuts high operating costs but also produces real net new revenue and significant market expansion.
Host: As insurance companies like AIG and other enterprises continue to adopt AI, what will be the biggest bottleneck for AI scaling in the next three years? Is it power, computing power, data quality, regulation, or talent?
Researcher: This is a very good question, and the market has been discussing what the current bottleneck is. I think the ultimate bottleneck boils down to power and computing power.
If OpenAI wants to launch a new product, or Claude Code is scaling up, or Anthropic wants to acquire new users, they all need GPUs, data centers to house those GPUs, and grid power to run them. Even xAI is building its own power plant. All these factors must come together, and I think this, perhaps even more than data or talent, is the main bottleneck.
Judging from the latest models and AI-lab research trends, models increasingly generate their own training data. Human-generated seed data remains important, but it can be expanded almost without limit through synthetic data generation. Models are also helping discover new algorithmic breakthroughs that improve their own performance; for example, OpenAI has described its latest programming model as the first trained with the assistance of a previous-generation model. This partially relieves the talent bottleneck. Of course, talent still matters greatly, which is why so much talent circulates among the four core AI labs.
But I still rank computing power first, assuming you have a data center and enough power to run the chips. There are also trade-offs to make when you hit a bottleneck. People used to say we were running out of data, but chain-of-thought reasoning showed that we can spend additional compute to generate new data from existing data. If you hit a bottleneck in one resource, you can consume another to keep improving AI capability.