https://aeon.co/essays/the-hard-problem-of-consciousness-is-a-distraction-from-the-real-one
What is the best way to understand consciousness? In philosophy, centuries-old
debates continue to rage over whether the Universe is divided, following René
Descartes, into mind stuff and matter stuff. But the rise of modern
neuroscience has seen a more pragmatic approach gain ground: an approach
that is guided by philosophy but doesn't rely on philosophical research to
provide the answers. Its key is to recognise that explaining why consciousness
exists at all is not necessary in order to make progress in revealing its material
basis; the task, instead, is to start building explanatory bridges from the
subjective and phenomenal to the objective and measurable.
But there is an alternative, which I like to call the real problem: how to account
for the various properties of consciousness in terms of biological mechanisms,
without pretending it doesn't exist (the 'easy problem') and without worrying too
much about explaining its existence in the first place (the 'hard problem'). (People
familiar with neurophenomenology will see some similarities with this way of
putting things, but there are differences too, as we will see.)
There are some historical parallels for this approach, for example in the study of
life. Once, biochemists doubted that biological mechanisms could ever explain
the property of being alive. Today, although our understanding remains
incomplete, this initial sense of mystery has largely dissolved. Biologists have
simply gotten on with the business of explaining the various properties of living
systems in terms of underlying mechanisms: metabolism, homeostasis,
reproduction and so on. An important lesson here is that life is not one thing;
rather, it has many potentially separable aspects.
What are the fundamental brain mechanisms that underlie our ability to be
conscious at all? Importantly, conscious level is not the same as wakefulness.
When you dream, you have conscious experiences even though you're asleep.
And in some pathological cases, such as the vegetative state (sometimes called
'wakeful unawareness'), you can be altogether without consciousness, but still
go through cycles of sleep and waking.
So what underlies being conscious specifically, as opposed to just being awake?
We know it's not just the number of neurons involved. The cerebellum (the so-
called 'little brain' hanging off the back of the cortex) has about four times as
many neurons as the rest of the brain, but seems barely involved in maintaining
conscious level. It's not even the overall level of neural activity: your brain is
almost as active during dreamless sleep as it is during conscious wakefulness.
Rather, consciousness seems to depend on how different parts of the brain
speak to each other, in specific ways.
But what is the quality that brain-complexity measures are measuring? This is
where new theoretical ideas about consciousness come into play. These start in
the late 1990s, when Gerald Edelman (my former mentor at the Neurosciences
Institute in San Diego) and Giulio Tononi (now at the University of Wisconsin
in Madison) argued that conscious experiences were unique in being
simultaneously highly informative and highly integrated.
It turns out that the maths that captures this co-existence of information and
integration maps onto the emerging measures of brain complexity I described
above. This is no accident: it is an application of the real problem strategy.
We're taking a description of consciousness at the level of subjective experience,
and mapping it to objective descriptions of brain mechanisms.
Some researchers take these ideas much further, to grapple with the hard
problem itself. Tononi, who pioneered this approach, argues that consciousness
simply is integrated information. This is an intriguing and powerful proposal,
but it comes at the cost of admitting that consciousness could be present
everywhere and in everything, a philosophical view known as panpsychism. The
additional mathematical contortions needed also mean that, in practice,
integrated information becomes impossible to measure for any real complex
system. This is an instructive example of how targeting the hard problem, rather
than the real problem, can slow down or even stop experimental progress.
Experiments such as these have identified brain regions that are consistently
associated with conscious perception, independently of whether that perception
is visual, auditory or in some other sensory modality. The most recent chapter in
this story involves experiments that try to distinguish between those brain
regions involved in reporting about a conscious percept (eg, saying: 'I see a
face!') from those involved in generating the conscious percept itself. But as
powerful as these experiments are, they do not really address the real problem
of consciousness. To say that a posterior cortical hot-spot (for instance) is
reliably activated during conscious perception does not explain why activity in
that region should be associated with consciousness. For this, we need a general
theory of perception that describes what brains do, not just where they do it.
In the 19th century, the German polymath Hermann von Helmholtz proposed
that the brain is a prediction machine, and that what we see, hear and feel are
nothing more than the brain's best guesses about the causes of its sensory
inputs. Think of it like this. The brain is locked inside a bony skull. All it receives
are ambiguous and noisy sensory signals that are only indirectly related to
objects in the world. Perception must therefore be a process of inference, in
which indeterminate sensory signals are combined with prior expectations or
beliefs about the way the world is, to form the brain's optimal hypotheses of the
causes of these sensory signals: of coffee cups, computers and clouds. What we
see is the brain's best guess of what's out there.
It's easy to find examples of predictive perception both in the lab and in
everyday life. Walking out on a foggy morning, if we expect to meet a friend at a
bus stop, we might perceive her to be there, until closer inspection reveals a
stranger. We can also hear words in nonsensical streams of noise, if we are
expecting these words (play Stairway to Heaven backwards and you can hear
satanic poetry). Even very basic elements of perception are shaped by
unconscious beliefs encoded in our visual systems. Our brains have evolved to
assume (believe) that light comes from above, which influences the way we
perceive shapes in shadow.
The classical view of perception is that the brain processes sensory information
in a bottom-up or outside-in direction: sensory signals enter through receptors
(for example, the retina) and then progress deeper into the brain, with each
stage recruiting increasingly sophisticated and abstract processing. In this view,
the perceptual heavy-lifting is done by these bottom-up connections. The
Helmholtzian view inverts this framework, proposing that signals flowing into
the brain from the outside world convey only prediction errors: the differences
between what the brain expects and what it receives. Perceptual content is
carried by perceptual predictions flowing in the opposite (top-down) direction,
from deep inside the brain out towards the sensory surfaces. Perception involves
the minimisation of prediction error simultaneously across many levels of
processing within the brain's sensory systems, by continuously updating the
brain's predictions. In this view, which is often called predictive coding or
predictive processing, perception is a 'controlled hallucination', in which the
brain's hypotheses are continually reined in by sensory signals arriving from the
world and the body. 'A fantasy that coincides with reality', as the psychologist
Chris Frith eloquently put it in Making Up the Mind (2007).
Armed with this theory of perception, we can return to consciousness. Now,
instead of asking which brain regions correlate with conscious (versus
unconscious) perception, we can ask: which aspects of predictive perception go
along with consciousness? A number of experiments are now indicating that
consciousness depends more on perceptual predictions than on prediction
errors. In 2001, Alvaro Pascual-Leone and Vincent Walsh at Harvard Medical
School asked people to report the perceived direction of movement of clouds of
drifting dots (so-called random dot kinematograms). They used transcranial
magnetic stimulation (TMS) to specifically interrupt top-down signalling across
the visual cortex, and they found that this abolished conscious perception of the
motion, even though bottom-up signals were left intact.
Of the many distinctive experiences within our inner universes, one is very
special. This is the experience of being you. It's tempting to take experiences of
selfhood for granted, since they always seem to be present, and we usually feel a
sense of continuity in our subjective existence (except, of course, when emerging
from general anaesthesia). But just as consciousness is not just one thing,
conscious selfhood is also best understood as a complex construction generated
by the brain.
There is the bodily self, which is the experience of being a body and of having a
particular body. There is the perspectival self, which is the experience of
perceiving the world from a particular first-person point of view.
The volitional self involves experiences of intention and of agency of urges to
do this or that, and of being the causes of things that happen. At higher levels,
we encounter narrative and social selves. The narrative self is where the 'I'
comes in, as the experience of being a continuous and distinctive person over
time, built from a rich set of autobiographical memories. And the social self is
that aspect of self-experience that is refracted through the perceived minds of
others, shaped by our unique social milieu.
Let's take the example of bodily selfhood. In the famous rubber-hand illusion, I
ask you to focus your attention on a fake hand while your real hand is kept out
of sight. If I then simultaneously stroke your real hand and the fake hand with a
soft paintbrush, you may develop the uncanny feeling that the fake hand is now,
somehow, part of your body. This reveals a surprising flexibility in how we
experience owning our bodies and raises a question: how does the brain decide
which parts of the world are its body, and which aren't?
To answer this, we can appeal to the same process that underlies other forms of
perception. The brain makes its best guess, based on its prior beliefs or
expectations, and the available sensory data. In this case, the relevant sensory
data include signals specific to the body, as well as the classic senses such as
vision and touch. These bodily senses include proprioception, which signals the
body's configuration in space, and interoception, which involves a raft of inputs
that convey information from inside the body, such as blood pressure, gastric
tension, heartbeat and so on. The experience of embodied selfhood depends on
predictions about body-related causes of sensory signals across interoceptive
and proprioceptive channels, as well as across the classic senses. Our
experiences of being and having a body are controlled hallucinations of a very
distinctive kind.
Research in our lab is supporting this idea. In one experiment, we used so-called
augmented reality to develop a new version of the rubber-hand illusion,
designed to examine the effects of interoceptive signals on body ownership.
Participants viewed their surroundings through a head-mounted display,
focusing on a virtual reality version of their hand, which appeared in front of
them. This virtual hand was programmed to flash gently red, either in time or
out of time with their heartbeat. We predicted that people would experience a
greater sense of identity with the virtual hand when it was pulsing
synchronously with their heartbeat, and this is just what we found. Other
laboratories are finding that similar principles apply to other aspects of
the conscious self. For example, we experience agency over events when incoming
sensory data match the predicted consequences of actions; and breakdowns in
experienced agency, which can happen in conditions such as schizophrenia,
can be traced to abnormalities in this predictive process.
These findings take us all the way back to Descartes. Instead of 'I think,
therefore I am', we can say: 'I predict (myself), therefore I am.' The specific
experience of being you (or me) is nothing more than the brain's best guess of
the causes of self-related sensory signals.
There is a final twist to this story. Predictive models are good not only for
figuring out the causes of sensory signals, they also allow the brain to control or
regulate these causes, by changing sensory data to conform to existing
predictions (this is sometimes called active inference). When it comes to the
self, especially its deeply embodied aspects, effective regulation is arguably more
important than accurate perception. As long as our heartbeat, blood pressure
and other physiological quantities remain within viable bounds, it might not
matter if we lack detailed perceptual representations. This might have
something to do with the distinctive character of experiences of being a body,
in comparison with experiences of objects in the world or of the body as an
object.
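The contrast between perceiving and regulating can be sketched in a few lines. The following toy regulator is illustrative only, and its variable names and numbers are invented: instead of revising its prediction to fit the data, as in perceptual inference, the agent acts on the world so that the data come to fit a fixed set-point prediction, which is the flavour of active inference relevant to physiological regulation.

```python
# Toy active-inference-style regulator: the 'prediction' (set_point) is
# held fixed, and action changes the sensed variable x to cancel the
# prediction error, as in reflexive homeostatic control. Illustrative only.

def regulate(x, set_point, gain=0.2, steps=50):
    """Drive a physiological variable x toward set_point by acting to
    reduce the prediction error, rather than updating the prediction."""
    for _ in range(steps):
        error = set_point - x  # interoceptive prediction error
        x += gain * error      # action alters the data, not the model
    return x

# A blood-pressure-like variable is pulled back toward the predicted
# value; detailed perceptual accuracy along the way is beside the point.
print(regulate(x=140.0, set_point=120.0))
```

Note the asymmetry with the perceptual case: here the model never changes, which is one way of cashing out the idea that for the embodied self, control matters more than accurate representation.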
And this returns us one last time to Descartes. In dissociating mind from body,
he argued that non-human animals were nothing more than 'beast machines',
without any inner universe. In his view, basic processes of physiological
regulation had little or nothing to do with mind or consciousness. I've come to
think the opposite. It now seems to me that fundamental aspects of our
experiences of conscious selfhood might depend on control-oriented predictive
perception of our messy physiology, of our animal blood and guts. We are
conscious selves because we too are beast machines: self-sustaining flesh-bags
that care about their own persistence.