Cathie Wood: Agents Rewrite the Logic of Making Money, and Capital Flows First in Three Directions
Money never favors those who are merely smarter.
It only believes in one thing: who is more capable.
Recently, a "lobster" has been everywhere in Silicon Valley. The open-source intelligent agent MoltBot (renamed from Clawdbot over a trademark issue) has become wildly popular: it does not just chat, it actually delivers completed tasks.
Meanwhile, Cathie Wood, founder of ARK Invest, stated plainly in the firm's latest annual report that AI inference costs are falling by 70%-99% per year, that global GDP growth may exceed 7% annually, and that inflation may even stay below 1% over the long run.
Put the two together: intelligent agents are being deployed widely, and inference costs are collapsing year after year. AI is no longer a demo tool; it is a labor force that can be deployed at scale.
When intelligent agents become the labor force, the logic of making money changes: capital starts to flow to whoever can deliver results. Cathie Wood points to three directions:
Digital labor force: Intelligent agents like MoltBot are evolving from chat tools into digital employees that deliver finished tasks.
Physical labor force: Robotaxis are moving from technology showcases to a closed-loop business model; by 2030 the global market may reach $34 trillion.
Underlying infrastructure: Plunging inference costs, the restart of nuclear energy, and the emergence of space data centers are turning AI into infrastructure as common as water and electricity.
This is not the future; this is the present.
Section 1 | Behind MoltBot: The Digital Labor Market Has Opened
MoltBot went viral, yet it is not the product of a large company; it is a small project that grew organically out of the open-source community.
Within a few days, this "lobster" spread rapidly through Silicon Valley circles. People use it to handle tasks, categorize notes, set wake-up reminders, and manage email. Cathie Wood describes it as an AI intern that organizes your life even while you sleep.
But it is more than an amusing tool. Her research team points out that agents like this are no longer chatbots; they are starting to be deployed as a labor force. They can read information, make decisions, and execute tasks. In Cathie Wood's words, MoltBot is not just holding a conversation; it is delivering results.
Behind this, a new market is taking shape.
A few years ago, AI was a question-answering tool: you asked a question and waited for an answer. Since projects like MoltBot and GPTAgent appeared, the most obvious change is that people no longer just ask questions; they assign work.
For example, open MoltBot and simply say, "Pull every meeting invitation from my text messages, emails, and memos from the past three days and build me a schedule," and you get back a Word document, a calendar file, and a summary report.
This task-to-completion mode of interaction lets AI formally take over work that used to require a human assistant.
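To make "assign work, get deliverables back" concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the messages, the scheduling logic, and the output files merely stand in for what an agent like MoltBot might produce; this is not MoltBot's actual code.

```python
# Hypothetical sketch of a "task in, deliverables out" step.
# The messages, the filtering, and the file names are invented for illustration;
# a real agent would call a language model to do the extraction.
from datetime import datetime, timedelta

now = datetime.now()
messages = [  # stand-ins for texts, emails, and memos collected by the agent
    {"source": "email", "title": "Design review", "when": now + timedelta(days=1, hours=2)},
    {"source": "sms",   "title": "Dentist",       "when": now + timedelta(days=2)},
]

def to_ics(events):
    """Render events as a minimal iCalendar file (simplified: no time zones, no UIDs)."""
    lines = ["BEGIN:VCALENDAR", "VERSION:2.0"]
    for e in events:
        lines += ["BEGIN:VEVENT",
                  f"SUMMARY:{e['title']}",
                  f"DTSTART:{e['when']:%Y%m%dT%H%M%S}",
                  "END:VEVENT"]
    return "\n".join(lines + ["END:VCALENDAR"])

# The "deliverables": a calendar file plus a short summary report.
with open("schedule.ics", "w") as f:
    f.write(to_ics(messages))
with open("summary.txt", "w") as f:
    f.write("\n".join(f"{m['when']:%a %H:%M} - {m['title']} ({m['source']})" for m in messages))
print("Wrote schedule.ics and summary.txt")
```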
Cathie Wood believes that this is the starting point of the digital labor force.
Some may ask: hasn't this existed for a while? Why did it suddenly take off in 2026?
The answer is that several things came together at once, pushing intelligent agents from merely usable to genuinely useful.
First, AI capability has jumped:
- Response time has dropped from about 30 seconds to 3 seconds;
- The amount of material it can read has grown from a few pages to hundreds of pages of documents;
- It has gone from needing step-by-step guidance to executing an entire workflow from a single instruction.
More importantly, these systems are now sufficiently open-source and able to run locally.
With MoltBot, a developer can install an intelligent agent directly on a personal computer, with no cloud dependency and no account login. From companies to individuals, anyone can deploy their own AI intern.
Cathie Wood mentioned the changes at ARK in an interview. After their chief AI analyst started using MoltBot, work efficiency significantly improved. The same is true for the entire team: "We didn't hire more people, but our delivery speed has doubled."
In the past, one person could only do one person's work.
Now, one person can be paired with several intelligent agents, with each agent specializing in a specific type of task: collecting data, writing first drafts, understanding policy changes, generating presentation slides, and organizing meeting minutes.
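As a loose illustration of the "one person, several specialized agents" pattern described above, the sketch below routes tasks by type to separate agent stubs. The task types and handler functions are hypothetical placeholders, not features of any particular product.

```python
# Hypothetical fan-out: one person dispatches work to specialized agent stubs.
# Each function is a placeholder; in practice it would wrap a model call.
from concurrent.futures import ThreadPoolExecutor

def research_agent(task):  return f"[data collected for: {task}]"
def drafting_agent(task):  return f"[first draft of: {task}]"
def slides_agent(task):    return f"[slide outline for: {task}]"

AGENTS = {"research": research_agent, "draft": drafting_agent, "slides": slides_agent}

tasks = [("research", "Q3 robotaxi market"), ("draft", "client memo"), ("slides", "board update")]

# Run the specialized agents in parallel, like one person supervising several interns.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(lambda t: AGENTS[t[0]](t[1]), tasks))

for (kind, task), result in zip(tasks, results):
    print(f"{kind:8s} {task!r:24s} -> {result}")
```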
When an intelligent agent can do the work of an intern, the real question is not whether AI can work but how to price these digital labor forces.
This is what Cathie Wood calls the digital labor market.
Section 2 | Robotaxi: From Technology Narrative to a Closed-Loop Business
Intelligent agents are invisible digital labor forces, but there is another type of labor force that is already on the road.
For the past few years, the popular view of autonomous driving was that it burned money, remained a distant dream, and was stuck in testing. In Cathie Wood's view, however, Robotaxis are moving from a technology narrative to a source of revenue: they do not just operate, they generate cash flow.
She clearly stated in this conversation:
"Robotaxi is the most promising business model for us in the next five years. Its market size is not only large but can also be clearly quantified."
1. A clear path to profit: fewer vehicles, higher efficiency
A Robotaxi's utilization rate can reach 50-60%, versus only 4-5% for a private car. Fewer vehicles can therefore serve more demand.
She cited a statistic: Uber currently covers only about 1% of US urban mileage with 140,000 vehicles. Covering 100% of urban mileage would take only 24 million Robotaxis, less than one-tenth of the roughly 400 million cars in the United States today.
The key is that with no driver costs, automatic recharging, and AI self-scheduling, the marginal cost is extremely low. In the past, one driver was paired with one vehicle; now one scheduling backend can manage every vehicle in a city.
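A quick back-of-envelope check of the figures above (the inputs are the numbers quoted in this article; the linear ratios are a naive simplification, not ARK's actual model):

```python
# Back-of-envelope arithmetic using the figures quoted in this article.
# The simple ratios below are a naive check, not ARK's actual model.
robotaxi_utilization = 0.55      # midpoint of the quoted 50-60%
private_car_utilization = 0.045  # midpoint of the quoted 4-5%
print(f"One Robotaxi delivers ~{robotaxi_utilization / private_car_utilization:.0f}x "
      f"the useful hours of a private car")

us_cars = 400_000_000          # car count quoted in the article
robotaxis_needed = 24_000_000  # ARK's estimate for covering 100% of urban mileage
print(f"Required fleet is ~{robotaxis_needed / us_cars:.0%} of today's US car count")

uber_per_mile, robotaxi_per_mile = 2.80, 0.20  # per-mile prices cited later in the article
print(f"Projected fares are ~{uber_per_mile / robotaxi_per_mile:.0f}x lower per mile than Uber today")
```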
Robotaxis are not merely a new mode of transport. They are a stable, scalable automated operating model, tied to electricity, AI chips, and right-of-way scheduling, becoming part of urban infrastructure that earns revenue by fulfilling real orders.
That makes Robotaxi a replicable profit model rather than a technology exhibit. In Cathie Wood's investment framework, such projects can, like early Tesla, both raise outside capital on the strength of the story and generate self-sustaining cash flow.
2. Where will the money flow?
In the past, when discussing Robotaxis, the focus was on whether the vehicles could operate and whether the regulations were in place. Now, Cathie Wood is more concerned about another question: Who can share the real revenue from the large - scale operation of Robotaxis?
She is optimistic about four types of players:
- Scheduling platform providers: companies that can connect multiple Robotaxi brands, plan city-level routes, and match orders;
- Energy service providers: those able to manage overnight charging, fleet repositioning, and grid balancing;
- Chip and computing-power infrastructure providers: the underlying services that supply stable processing for inference and route planning;
- Cities and real-estate developers: once Robotaxi costs fall, previously remote areas may become newly attractive pockets of undervalued land.
This is no longer just a single-point technological innovation but a reconstruction of an economic chain, from transportation to urban layout, from energy scheduling to consumption habits.
She said that when vehicles start to operate autonomously, the tax structure of cities will also change.
3. Technology is not the problem; the conditions are maturing
Cathie Wood stated plainly that the technology for Robotaxis has been ready for some time. What has held it back is the pace of regulation, missing infrastructure, and the high cost of running AI.
Now, however, all three constraints are being cleared at once:
- Regulation: multiple cities have approved night-time operation of driverless vehicles;
- Infrastructure: many places are building dedicated charging stations and waiting areas for Robotaxis;
- Cost: AI's per-kilometer operating cost is already lower than a human driver's.
Many carmakers spent the past decade losing money on concepts. Robotaxis have finally reached the point where they can become a source of profit.
Cathie Wood particularly emphasizes Tesla's advantage: its cost structure is roughly 50% lower than Waymo's, letting it price rides at 20 cents per mile, versus the roughly $2.80 per mile Uber charges today. On this trajectory, the global Robotaxi ecosystem could reach $34 trillion by 2030.
MoltBot is a digital labor force, and Robotaxi is a physical labor force.
However, their large - scale operation depends on the same thing: the cost of running AI must be low enough.
Section 3 | The Sharp Decline in Inference Cost: AI Is Becoming Infrastructure
MoltBot and Robotaxis can be put to work because one particular cost is plummeting.
That cost is inference. Cathie Wood stressed repeatedly in the interview that inference costs are falling 70-99% per year, a change that forces every business model to be recalculated.
1. What is the inference cost?
It is not the money spent training a large model; it is the electricity, chips, and memory actually consumed every time you use AI to do something.
For example, when you paste in a paragraph and ask Claude or GPT to write you a plan, the resources the server consumes at that moment to run the model and generate a response are the inference cost.
A single call that used to cost a few cents may now cost a fraction of a cent. It is not that the model itself got cheaper; it is that using the model to get things done did.
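To make this concrete, here is a rough per-call estimate. The token counts and the price per million tokens are hypothetical placeholders (real prices vary by model, provider, and date); the point is the shape of the calculation and how an annual decline compounds:

```python
# Rough per-call inference cost: tokens processed x price per token.
# Token counts and the $/million-token price are hypothetical placeholders.
input_tokens, output_tokens = 2_000, 800  # one "write me a plan" request
price_per_million_tokens = 3.00           # hypothetical blended price in dollars

cost_per_call = (input_tokens + output_tokens) / 1_000_000 * price_per_million_tokens
print(f"Cost per call today: ${cost_per_call:.4f}")  # about $0.0084

# How a steady 70% (or 99%) annual decline compounds over three years.
for annual_drop in (0.70, 0.99):
    after_three_years = cost_per_call * (1 - annual_drop) ** 3
    print(f"{annual_drop:.0%} yearly decline -> ${after_three_years:.2e} per call after 3 years")
```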
This change has little impact on ordinary users but is crucial for corporate decision-making.
When the inference cost is high, AI is a demonstration tool; when the cost is low enough for large-scale deployment, it becomes a labor force. This round of AI growth is not driven by increasing the number of people or raising prices but by cost deflation.
2. Deflation has changed the competition logic
When inference becomes as cheap as water and electricity, companies stop asking whether AI can do a task and start asking: how much does each run cost me, and how many tasks can I hand over to AI?
At this time, the business models of all AI tools, platforms, and service providers need to be recalculated.
Companies behind models like Claude and ChatGPT will be asked: if your agent runs ten times, how many people's work does it save?
Power platforms will be asked: Can you reduce my electricity cost during peak hours?
AI chip platforms need to answer: Can you double the amount of data processed per second without increasing the electricity bill?
Capital no longer focuses on whose model is larger but on who can reduce the cost of each operation and deliver services more stably.
3. Electricity and computing power have become the keys to victory
Behind this round of cost reduction, the infrastructure is being upgraded at an accelerated pace.
Cathie Wood specifically mentioned three directions:
The restart of nuclear energy. She pointed out that if nuclear energy had not been over-regulated in the 1970s, today's electricity prices could be 40% lower. Now several US states are restarting nuclear plants, and China is building 28 large-scale reactors at once. Cheaper electricity means lower AI operating costs.
Space data centers. SpaceX's reusable rockets are making orbital data centers possible, and solar power in orbit is about six times as efficient as on the ground. That would sharply cut the electricity and cooling costs of data centers, making AI inference cheaper still.
Distributed energy systems. She pointed to the grid-efficiency problem: demand is low at night, while daytime usage spikes depending on the weather. In the future, Robotaxis themselves can serve as mobile energy-storage units, balancing grid load and raising the energy utilization of the whole system.
Taken together, these changes allow AI to be accessed, priced, and used at scale, like infrastructure.
With every percentage point that inference costs fall, a new batch of applications becomes viable.
When the cost goes down, the money will flow there.
Conclusion | The answer to where the money will flow is already clear
AI intelligent agents are not something for the future but are already at work.
Cathie Wood does not dwell on how powerful the models are; she focuses entirely on the ability to deliver:
Digital assistants like MoltBot can double a person's output;
Physical labor forces like Robotaxis may reach a market size of $34 trillion by 2030;
The 70-99% annual decline in inference cost is making AI as common as water and electricity.
Money has already started to flow to the side that can deliver results.
It's not about who is smarter but about who is more capable.
Reference materials:
https://www.youtube.com/watch?v=VoW13oUTTV0&pp=2AatDA%3D%3D
https://www.youtube.com/watch?v=VTKPbxhP8jE
https://www.youtube.com/watch?v=wTKHzbLQfyc
This article is from the WeChat official account “AI Deep Researcher”. Author: AI Deep Researcher. Editor: Shen Si. Republished by 36Kr with permission.