The mind–body problem resolved? Integrated information theory posits that any one conscious experience, here that of looking at a Bernese mountain dog, is identical to a maximally irreducible cause-effect structure. Its physical substrate, its Whole, is the operationally defined neural correlate of consciousness. The experience is formed by the Whole but is not identical to it.
Consider the image above, in which I’m looking at my dog Ruby and have a particular visual experience, a maximally irreducible cause-effect structure. It is constituted by the underlying physical substrate, the Whole, here a particular neural correlate of consciousness within the hot zone in my posterior cortex. But the experience is not identical to the Whole. My experience is not my brain.
This Whole has definite borders; a particular neuron is either part of it or not. This holds even if that neuron provides some synaptic input to the Whole. What defines the Whole is a maximum of integrated information, with the maximum being evaluated over all spatiotemporal scales and levels of granularity, such as molecules, proteins, subcellular organelles, single neurons, large ensembles of them, the environment the brain interacts with, and so on.
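The search this paragraph describes — score every candidate substrate for irreducibility and let the maximum win — can be sketched in a few lines of code. The sketch below is a toy illustration only: the network, the units, and the scoring function are all invented for this example, and the `phi_proxy` function is a crude stand-in (the weakest cut of a directed graph) for IIT's actual Φ, which is defined over cause-effect repertoires and is far more involved.

```python
from itertools import combinations

# Toy "system": directed edges among five units. Units 0-2 are densely
# interconnected (a stand-in for the posterior hot zone); units 3 and 4
# are weakly coupled hangers-on (a stand-in for neurons that feed input
# to the Whole without belonging to it).
edges = {(0, 1), (1, 0), (1, 2), (2, 1), (0, 2), (2, 0), (3, 0), (4, 3)}

def phi_proxy(subset):
    """Crude irreducibility proxy: score the subset by its weakest
    bipartition, measured as the number of directed edges crossing
    the cut. A reducible set has a cut no edge crosses, scoring 0."""
    subset = frozenset(subset)
    if len(subset) < 2:
        return 0
    best = None
    for r in range(1, len(subset)):
        for part in combinations(sorted(subset), r):
            a, b = set(part), subset - set(part)
            crossing = sum(1 for (i, j) in edges
                           if (i in a and j in b) or (i in b and j in a))
            best = crossing if best is None else min(best, crossing)
    return best

units = range(5)
candidates = [s for r in range(2, 6) for s in combinations(units, r)]
whole = max(candidates, key=phi_proxy)  # exclusion: only the maximum exists
print(whole, phi_proxy(whole))         # the densely coupled core wins
```

In this toy, the densely coupled trio beats every superset that drags in the weakly attached units, mirroring the claim that a neuron merely feeding input to the Whole is not thereby part of it.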
It is the irreducible Whole that forms my conscious experience, not the underlying neurons. So not only is my experience not my brain, it is most certainly not my individual neurons. While a handful of cultured neurons in a dish may have an itsy-bitsy amount of experience, forming a mini-mind, the hundreds of millions of neurons making up my posterior cortex do not embody a collection of millions of mini-minds. There is only one mind, my mind, constituted by the Whole in my brain.
Other Wholes may exist in my brain, or my body, as long as they don’t share elements with the posterior hot zone Whole. Thus, it may feel like something to be my liver, but given the very limited interactions among liver cells, I doubt it feels like a lot.
The exclusion principle also explains why consciousness ceases during deep, slow-wave sleep. At this time, delta waves dominate the EEG and cortical neurons have regular hyperpolarized down-states during which they are silent, interspersed with active up-states when neurons are more depolarized. These on- and off-periods are regionally coordinated. As a consequence, the cortical Whole breaks down, shattering into small cliques of interacting neurons. Each one probably has only a whit of integrated information. Effectively, “my” consciousness vanishes in deep sleep, replaced by a myriad of tiny Wholes, none of which is remembered upon awakening.
The exclusion postulate also dictates whether or not an aggregate of conscious entities — ants in a colony, cells making up a tree, bees in a hive, starlings in a murmurating flock, an octopus with its eight semiautonomous arms, or the hundreds of Chinese dancers and musicians during the choreographed opening ceremony of the 2008 Olympic Games in Beijing — exists as a conscious entity in its own right. A herd of buffalo during a stampede or a crowd can act as if it had “one mind,” but this remains a mere figure of speech unless there is a phenomenal entity that feels like something above and beyond the experiences of the individuals making up the group. Per IIT, such a group mind would require the extinction of the individual Wholes, since the integrated information of each individual would have to be less than the Φmax of the group. Everybody in the crowd would give up his or her individual consciousness to the mind of the group, like being assimilated into the hive mind of the Borg in the “Star Trek” universe.
IIT’s exclusion postulate does not permit the simultaneous existence of both individual and group minds. Thus, the Anima Mundi or world soul is ruled out, as it would require that the minds of all sentient beings be extinguished in favor of the all-encompassing soul. Likewise, it does not feel like anything to be the three hundred million citizens of the United States of America. As an entity, the United States has considerable extrinsic causal powers, such as the power to execute its citizens or start a war. But the country does not have maximally irreducible intrinsic cause-effect power. Countries, corporations, and other group agents exist as powerful military, economic, financial, legal, and cultural entities. They are aggregates but not Wholes. They have no phenomenal reality and no intrinsic causal power.
Thus, per IIT, single cells may have some intrinsic existence, but this does not necessarily hold for the microbiome or trees. Animals and people exist for themselves, but herds and crowds do not. Maybe even atoms exist for themselves, but certainly not spoons, chairs, dunes, or the universe at large.
IIT posits two sides to every Whole: an exterior aspect, known to the world and interacting with other objects, including other Wholes; and an interior aspect, what it feels like, its experience. It is a solitary existence, with no direct windows into the interior of other Wholes. Two or more Wholes can fuse to give rise to a larger Whole but at the cost of losing their previous identity.
Finally, panpsychism has nothing intelligible to say about consciousness in machines. But IIT does. Conventional digital computers, built out of circuit components with sparse connectivity and little overlap among their inputs and outputs, do not constitute a Whole. Computers have only a tiny amount of highly fragmented intrinsic cause-effect power, no matter what software they are executing and no matter their computational power. Androids, if their physical circuitry is anything like today’s CPUs, cannot dream of electric sheep. It is, of course, possible to build computing machinery that closely mimics neuronal architectures. Such neuromorphic engineering artifacts could have lots of integrated information. But we are far from building those.
IIT can be thought of as an extension of physics to the central fact of our lives — consciousness. Textbook physics deals with the interaction of objects with each other, dictated by extrinsic causal powers. Your experience and mine are what brains with irreducible intrinsic causal powers feel like from the inside.
IIT offers a principled, coherent, testable, and elegant account of the relationship between these two seemingly disparate domains of existence — the physical and the mental — grounded in extrinsic and intrinsic causal powers. Causal power of two different kinds is the only sort of stuff needed to explain everything in the universe. These powers constitute ultimate reality.
Further experimental work will be essential to validate, modify, or perhaps even reject these views. If history is any guide, future discoveries in laboratories and clinics, or perhaps off-planet, will surprise us.
We have come to the end of our voyage. Illuminated by the light of our pole star — consciousness — the universe reveals itself to be an orderly place. It is far more enminded than modernity, blinded by its technological supremacy over the natural world, takes it to be. It is a view more in line with earlier traditions that respected and feared the natural world.
Experience is in unexpected places, including in all animals, large and small, and perhaps even in brute matter itself. But consciousness is not in digital computers running software, even when they speak in tongues. Ever-more powerful machines will trade in fake consciousness, which will, perhaps, fool most. But precisely because of the looming confrontation between natural, evolved intelligence and artificial, engineered intelligence, it is absolutely essential to assert the central role of feeling in a lived life.
Christof Koch is Chief Scientist of both the MindScope Program at the Allen Institute for Brain Science and The Tiny Blue Dot Foundation, following 27 years as a Professor at the California Institute of Technology. He is the author of several books, including “Consciousness: Confessions of a Romantic Reductionist” and “The Feeling of Life Itself: Why Consciousness Is Widespread but Can’t Be Computed,” from which this article is adapted.