Does consciousness come from "living computation"?
As artificial intelligence grows ever more capable, an old question has returned to the forefront: if a program is complex enough, can a machine truly develop consciousness? A review paper recently published in Neuroscience & Biobehavioral Reviews offers a thought-provoking answer: consciousness is not a piece of code running on hardware but arises from a special kind of "computing matter."
For a long time, discussions about consciousness have tended to fall into two opposing camps. One view holds that the mind is like software: as long as the information processing is right, consciousness can arise whether it runs on neurons or on silicon chips. The other view insists that consciousness cannot be separated from the unique structure and life processes of the biological brain; change the material, and it is no longer consciousness. The new study suggests that this either-or framing may itself be a misunderstanding.
The researchers proposed a new concept called "biological computationalism." They do not deny that the brain performs computations, but they emphasize that this type of computation is completely different from the digital computation we are familiar with. The brain does not move symbols around step by step like a computer; instead, it embeds computation deeply in its physical structure, energy consumption, and continuously changing dynamic processes. In other words, the brain is not running a program; it is the computational process itself.
The paper identifies three key characteristics of biological computation. First, the brain's computation is hybrid: it includes both discrete, switch-like events such as neuron firing and continuously varying processes such as electric fields, chemical gradients, and oscillatory activity; the two kinds of processes are intertwined, shape each other, and cannot be treated separately (a toy illustration follows below). Second, this computation cannot be cleanly divided by scale. From ion channels and dendrites to neural circuits and whole-brain activity, the different levels constantly influence one another, with no clean split between an "algorithm layer" and an "implementation layer"; changing the physical structure itself changes the computation. Third, the brain's computation is deeply shaped by energy constraints. The brain runs on remarkably little power, and it is precisely this limitation that pushes it toward a form of intelligence that is coordinated across scales, robust, and flexible.
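To make the first characteristic concrete, here is a minimal sketch of a hybrid model from computational neuroscience: a leaky integrate-and-fire neuron, in which a continuously evolving membrane potential and discrete spike events shape each other. The example is not taken from the paper, and all names and parameter values are illustrative assumptions.

```python
# Minimal sketch (not from the paper): a leaky integrate-and-fire neuron,
# a textbook example of "hybrid" computation in which a continuous quantity
# (the membrane potential) and discrete events (spikes) shape each other.
# All parameter values are arbitrary, illustrative choices.

TAU = 20.0        # membrane time constant (ms)
V_REST = -65.0    # resting potential (mV)
V_THRESH = -50.0  # spike threshold (mV)
V_RESET = -70.0   # reset potential after a spike (mV)
DT = 0.1          # integration time step (ms)

def simulate(input_current, v0=V_REST):
    """Integrate dV/dt = ((V_REST - V) + I) / TAU with threshold-and-reset spiking."""
    v = v0
    spike_times, trace = [], []
    for step, i_ext in enumerate(input_current):
        # Continuous part: the membrane potential drifts smoothly,
        # pulled back toward rest and pushed by the input current.
        v += DT * ((V_REST - v) + i_ext) / TAU
        # Discrete part: crossing the threshold triggers an all-or-none spike,
        # which in turn resets the continuous variable.
        if v >= V_THRESH:
            spike_times.append(step * DT)
            v = V_RESET
        trace.append(v)
    return spike_times, trace

if __name__ == "__main__":
    # A constant supra-threshold drive produces regular spiking.
    spikes, _ = simulate([20.0] * 5000)
    print(f"{len(spikes)} spikes in 500 ms of simulated time")
```

Note that even this toy model has to chop time into discrete steps (DT) in order to run on a digital computer: the simulation approximates the continuous dynamics rather than physically instantiating them, which is exactly the kind of distinction the paper draws below.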
For this reason, the researchers draw a counterintuitive conclusion: in the brain, the algorithm is the substrate. The physical structure is not a "shell" that houses the computation; it is the computation itself. Consciousness emerges from this special computing matter, not from an abstract piece of code.
This view also sets new boundaries for thinking about artificial intelligence. Current AI systems, however powerful, are essentially simulating functions on conventional hardware. They can approximate certain behavioral outcomes, but their computation is decoupled from its physical implementation, time advances in discrete update steps, and energy is hardly an internal constraint. The brain's computation, by contrast, unfolds in real physical time, driven by continuous fields and discrete events together, and on the paper's account these are the key mechanisms supporting the integration and continuity of conscious experience.
The researchers also specifically emphasize that this does not mean only biological beings can be conscious. They do not claim that consciousness must depend on carbon-based life; rather, they point out that if consciousness really does depend on this biological style of computation, then future artificial consciousness may require a fundamentally new kind of physical system, not just more complex code. The key is not whether the material is biological but whether the system has a computational structure that combines continuity with discreteness, is tightly coupled across scales, and is constrained by energy. From this perspective, asking "what algorithm should a machine run in order to be conscious" may be the wrong question. The more important question may be: what kind of physical system makes computation inseparable from its own dynamics? Only when computation is no longer an abstract description attached to hardware, but the system's own way of existing, can consciousness possibly emerge.
Reference: Milinkovic, B., & Aru, J. (2025). On biological and artificial consciousness: A case for biological computationalism. Neuroscience & Biobehavioral Reviews, 106524.
This article is from the WeChat official account "Neural Reality" (ID: neureality), author NR, and is republished by 36Kr with permission.