Questions of Consciousness

Ryan Hastings

Contrasting approaches for thinking about the mind

Before the latter half of the twentieth century, questions of consciousness were best left to philosophy. But science has begun to push into that conversation, which began with puzzles of self-consciousness and has grown into a bewildering variety of arguments and theories. Chemistry, biology, mathematics, and physics are contributing to an interdisciplinary science of mind. Many mysteries remain, and we have much to learn; some suggest that we will never fully understand human consciousness, but only lower-order, simpler systems. But we have learned a great deal, and the most modern sciences, along with the recent revolution in fractals and nonlinear dynamics, open the way for the scientific and philosophic communities to put forward reasonable suggestions for how consciousness works.

The approach that has gained in popularity, and that seems to produce the most plausible models, is the trialogue between neuroscience, psychology, and phenomenology. Neuroscience details how the brain works. Psychology postulates models for the mind, imagining systems that account for empirical results found through experimentation. Phenomenology is the philosophic pursuit of identifying and describing subjective experience and its contents. None of these fields is sufficient to tackle the hard problem of consciousness alone, but when their observations and ideas are compared, a powerful research program grows. In this article, I'll focus on the work done in the scientific disciplines.

Neuroscience can tell us how the brain works, but one cannot build a theory of mind from a complex network of interconnected cells. The properties of consciousness are emergent from the complexity, and cannot be predicted simply by knowing the physiological nitty-gritty of the wetware in which consciousness instantiates.

Psychology describes mental operations and finds the laws governing their behavior. Like anything else in nature, the mind works according to certain natural principles. None of this answers the question, though, of what consciousness has to do with anything. This problem was clearly formulated by Ray Jackendoff in his 1987 work, Consciousness and the Computational Mind. "Mind" can be used in two senses: one can speak of the "phenomenological mind," our world of perceptual and reflective experience, and the "computational mind," the processes by which perceptions, thoughts, and feelings are produced.

The mind/brain problem was multiplied — now there's a mind/mind problem. If that weren't enough, the details of the computational mind make consciousness as useless to cognitive psychology as it was to behaviorist psychology three decades ago. These computations could conceivably be carried out by a machine (for example, an AI) which possesses a computational mind without having any subjective experience. This mind/mind problem can now be addressed with neuroscience.

CONSCIOUSNESS AND THE BRAIN

Researchers used to assume that there was some part of the brain where consciousness resides, where all the sensed world is evaluated and a course of action decided. As far as we know, no such consciousness module exists. There is no part of the brain whose removal would eliminate consciousness while leaving all the other mental operations intact. Damage to parts of the brain damages parts of the totality of conscious experience; consciousness is related to the global state of the brain.

A lesion in a particular part of the brain causes prosopagnosia, the inability to recognize faces. No matter how well known, no matter how close emotionally, a prosopagnosic is incapable of assigning a name to a picture of a face. Yet empirical evidence suggests that some sort of facial perception is taking place. For instance, if the patient is shown a picture of a famous person, she can guess the correct name from a list of names statistically better than if she had never seen the face before.

The tempting trap which many psychologists fall into is supposing that the perception is occurring but for some reason not reaching consciousness. This follows from the assumption that complex cognitive procedures can take place without consciousness.

Neuropsychologist Marcel Kinsbourne disputes this. Clearly, the perceptual task is not taking place in the absence of consciousness. Though the results are better than random for name-guessing experiments and similar psychologic sleight-of-hand tricks, they are still quite poor. Consciousness is required for this task to take place. Because the damaged tissue has not been completely destroyed, a rudimentary degree of facial perception can still occur and exert an unconscious influence, but not enough of an influence to be consciously recognized and reported.

Kinsbourne starts with this and presents a hypothesis which differs from previous models for the neural correlate of consciousness (sometimes abbreviated NCC). Many models suggest that consciousness emerges from a particular anatomical or physiologic feature (such as 40 Hz oscillations of activity that spread through the dense feedback loops in the thalamocortical system, synchronizing perceptions from many different sense modalities, an idea we'll pick up again later). Kinsbourne instead looks at what qualities a cluster of neurons firing their messages (such as those responsible for perceiving the motion of leaves in the wind, or an idea for a paper) must have in order to be conscious.

REPRESENTATIONS & CONSCIOUSNESS

The brain is the most complex system of which we know. Even conservative estimates suggest more potential states for the system formed by the connections between nerve cells in the brain than the number of fundamental particles that physicists estimate to exist in the known Universe. Physical energy — heat, light, air vibration, pressure, temperature, or even molecules (of smell and taste) — is transduced into a neurologic code, written in patterns of neural firing that trace fractal shapes in the electrochemical soup. This code is called a "representation," a pattern of activity in a group of nerve cells representing some mental activity. Every thought, every perception, every memory is represented in the nervous system.

Conscious, attentive states involve the brain entering a self-organized critical (SOC) state. Thunderstorms and rush hour traffic are SOCs. They are a delicate balance of forces where small adjustments can be amplified into major changes in the whole system. Anyone who has spent any time with a cellular automaton artificial life program has seen the dynamics of an SOC. (Anyone who hasn't, but who is interested in nonlinear dynamics and chaos, would do well to type "cellular automata" into a search engine and find some shareware.) Cellular automata present the user with a checkerboard, with each of the squares in either an "on" (or "alive") or "off" (or "dead") state. The program then cycles through the grid: if a specified number of adjacent squares are "on," a square will turn "on," and if a specified number are "off," the square will be turned "off." Patterns of growth and evolution crawl across the screen. At first, the behavior of the entire system seems random, but very quickly it can stabilize into patterns organized around mathematical attractors. One can disturb the system by changing a particular square. Some changes don't affect the attractors much, and the system remains stable. But a change in the right place can cause the entire system to reorganize around new attractors. (This reorganization is called a bifurcation.)
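For readers who would rather experiment directly than hunt down shareware, here is a minimal sketch of the kind of program described above. The particular rule used (Conway's Game of Life) and the grid size are illustrative choices, not anything specified in this article.

```python
# A minimal cellular automaton: a grid of on/off cells updated by counting
# each cell's "on" neighbors. The rule here is Conway's Game of Life, chosen
# as a familiar example of the genre.

import random

SIZE = 20     # the grid is SIZE x SIZE and wraps around at the edges
STEPS = 10    # number of update cycles to run

def neighbors_on(grid, r, c):
    """Count the 'on' cells among the eight neighbors of cell (r, c)."""
    total = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            total += grid[(r + dr) % SIZE][(c + dc) % SIZE]
    return total

def step(grid):
    """Apply the update rule to every cell simultaneously."""
    new = [[0] * SIZE for _ in range(SIZE)]
    for r in range(SIZE):
        for c in range(SIZE):
            n = neighbors_on(grid, r, c)
            if grid[r][c] == 1:
                new[r][c] = 1 if n in (2, 3) else 0   # stays on with 2 or 3 live neighbors
            else:
                new[r][c] = 1 if n == 3 else 0        # turns on with exactly 3
    return new

# Start from a random configuration and watch patterns crawl and stabilize.
grid = [[random.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]
for _ in range(STEPS):
    grid = step(grid)
    print("\n".join("".join("#" if cell else "." for cell in row) for row in grid))
    print("-" * SIZE)
```

Changing a single cell between runs and comparing the outcomes gives a feel for the sensitivity described above: most perturbations wash out, while a few send the whole grid toward a different pattern.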

This is what an attentive conscious state is like. Certain attractors dominate the patterns of brain activity. These attractors, mathematical fictions that manifest in the neuronal representations, comprise the dominant focus of conscious attention; one can imagine a "fluffy white cloud" attractor for a person watching shapes in the sky. The representation, the perceptual information, rapidly self-transforms according to the perceptual procedures being performed on it until it stabilizes into a form that reaches consciousness.
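A minimal numerical analogy (a standard logistic-map exercise, not a brain model) shows the same two ideas in miniature: iterating a simple rule carries a state onto an attractor, and a small change to one parameter can reorganize that attractor entirely.

```python
# The logistic map x -> r * x * (1 - x) as a toy illustration of attractors.
# Iterate from an arbitrary starting point, discard the transient, and look
# at where the state settles. The parameter values are textbook choices.

def settle(r, x=0.2, transient=500, keep=8):
    """Iterate the map, skip the transient, return the settled states."""
    for _ in range(transient):
        x = r * x * (1.0 - x)
    states = []
    for _ in range(keep):
        x = r * x * (1.0 - x)
        states.append(round(x, 4))
    return states

print(settle(2.9))   # settles onto a single fixed point
print(settle(3.2))   # past a bifurcation: alternates between two values
print(settle(3.9))   # chaotic regime: never repeats
```

The analogy is loose, but it captures the flavor of a representation transforming itself under a fixed procedure until it lands in a stable form.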

So, then, which representations end up playing a role in consciousness? Kinsbourne suggests any representation which has a sufficient duration, a sufficient level of activation (intensity or complexity), and congruence with the entirety of the mind/brain.

A representation must last in a stable form for long enough to be included in a conscious report. Subliminal perception provides a fine example of this rule — if the stimulus is flashed too briefly, then although the object is perceived, the perception does not last long enough to be noticed or remembered.

A representation must have a sufficient level of activation; that is, a sufficient number of neurons in the cell assembly must be devoted to the representation. This simultaneously suggests a complexity of perception and a level of intensity that prevents the representation from being lost in the noise of unnoticed sense-data. Prosopagnosics lack this: the cell assemblies devoted to facial perception are damaged, and not enough neurons can participate to merit conscious attention and recollection.

Finally, the representation must be congruent with the mind/brain. Contents which break the continuum of experience (an illusion edited from the discontinuous and disparate events of consciousness) are unconscious and unremembered. Cases of dissociative personality disorders are clear examples of this on a psychologic level — sequences of memory are blocked from conscious recall, or in extreme situations assigned to a new personality altogether. At a neurological level, we see that the representations themselves must be brought into synchrony in order to be conscious. Unsynchronized neural activity is unconscious, even if it subtly affects the global state of the brain.
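To keep the three criteria straight, here is a toy restatement of them as a data structure and a single predicate. The field names and thresholds are invented for illustration; they are not drawn from Kinsbourne's paper.

```python
# A toy model of Kinsbourne's three criteria: duration, activation, congruence.
# Thresholds and numbers below are placeholders, not empirical values.

from dataclasses import dataclass

@dataclass
class Representation:
    label: str
    duration_ms: float   # how long the pattern remains stable
    activation: float    # fraction of the cell assembly participating (0 to 1)
    congruent: bool      # does it fit the ongoing stream of experience?

MIN_DURATION_MS = 100.0   # illustrative threshold
MIN_ACTIVATION = 0.3      # illustrative threshold

def reaches_consciousness(rep: Representation) -> bool:
    """A representation qualifies only if it meets all three criteria."""
    return (rep.duration_ms >= MIN_DURATION_MS
            and rep.activation >= MIN_ACTIVATION
            and rep.congruent)

examples = [
    Representation("subliminal flash", duration_ms=30, activation=0.6, congruent=True),
    Representation("face in prosopagnosia", duration_ms=400, activation=0.1, congruent=True),
    Representation("leaves moving in the wind", duration_ms=400, activation=0.6, congruent=True),
]
for rep in examples:
    print(rep.label, "->", reaches_consciousness(rep))
```

The first example fails on duration (subliminal perception), the second on activation (prosopagnosia), and only the third clears all three criteria.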

A brief word on synchrony, or the "binding problem": How is it that we experience simultaneity of different sense modalities? Why is it that we see someone's lips moving and hear their words synched up with the visual, when these perceptual processes take place at different rates? These aren't directly, physiologically linked anywhere in the wetware, so why do they seem linked in experience?

Almost all sensation (sight, sound, taste, interoceptive information on body states, proprioceptive accounts of body position, most everything but smell) passes through the thalamus. The thalamus is a structure deep in the forebrain. It connects to the cerebral cortex through ascending pathways (carrying impulses up to the cortex for complex perception) and descending pathways (feedback from the cortex). The thalamus serves as a relay station, modulating the sense-data according to the demands of the cortex. Perception emerges from the conversation between the two, in a complex feedback loop.

Anyone whose ears have been split by the audio feedback of a concert knows that feedback systems can become oscillators. The thalamocortical system is no exception. Neural activity in disparate areas (such as visual and auditory cortex) falls into a synchronized rhythm, which magnetoencephalographers measure to be 40 Hz.

This rhythm is reset when a novel stimulus is presented. Rodolfo Llinás suggests that this is a refresh rate for the brain: every 25 milliseconds, we experience a new conscious moment.
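For the curious, this kind of synchronization can be sketched with a pair of coupled phase oscillators (a Kuramoto-style model, used here as a generic stand-in rather than anything proposed by Llinás himself). Two units with natural frequencies near 40 Hz pull into a shared rhythm, and the period of that rhythm works out to roughly 25 milliseconds.

```python
# Two coupled phase oscillators (Kuramoto model) locking onto a shared rhythm.
# Frequencies, coupling strength, and time step are illustrative choices.

import math

freq1, freq2 = 39.0, 41.0        # natural frequencies in Hz
coupling = 20.0                  # coupling strength, strong enough to lock
dt = 0.0001                      # integration step in seconds
theta1, theta2 = 0.0, math.pi    # start the two phases half a cycle apart

# Euler-integrate the coupled equations for half a second of simulated time.
for _ in range(int(0.5 / dt)):
    d1 = 2 * math.pi * freq1 + coupling * math.sin(theta2 - theta1)
    d2 = 2 * math.pi * freq2 + coupling * math.sin(theta1 - theta2)
    theta1 += d1 * dt
    theta2 += d2 * dt

# After the transient, the phases lock with a small constant offset.
gap = math.atan2(math.sin(theta2 - theta1), math.cos(theta2 - theta1))
print(f"residual phase difference: {gap:.2f} rad")
print(f"shared rhythm: ~{(freq1 + freq2) / 2:.0f} Hz, "
      f"one cycle every {2000 / (freq1 + freq2):.0f} ms")
```

Set the coupling to zero and the phases drift apart again, which is the cartoon version of why synchrony, rather than anatomy, is a candidate for binding.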

THE MIND/MIND PROBLEM

Kinsbourne's neurodynamical notions can be brought back into cognitive science to solve Jackendoff's riddles, and with this I will conclude.

Remember, the computationalist approach in cognitive science presents a problem. The mind is split between the cognitive architectures and operations that process information to yield sensations, perceptions, thoughts, decisions, and behaviors, and the personal, subjective experience of those products of the processing (an experience blind to the operations themselves). This compounds the mind/brain problem into three: a computational mind/brain problem (how does the cognitive architecture relate to the neural architecture?), a phenomenological mind/brain problem (how does experience relate to the brain?), and a computational mind/phenomenal mind problem.

This article has taken the embodied or dynamical approach to cognitive science. It examines the mind as embodied in the brain; the cognitive operations are embodied in the neural operations. The synchronization of neural assemblies in the thalamocortical system into stable patterns that represent the contents of consciousness is the leading hypothesis for a solution to the phenomenological mind/brain problem. It is tempting to simply cast aside the more abstract computational approach, then, neuroscience having succeeded where computationalism has apparently failed.

However, in doing this, one runs the risk of throwing out the baby with the bathwater. Computational models continue to be useful in examining consciousness. For example, considering mental tasks as though a computer were performing them helps us formulate hypotheses about the mechanisms of human consciousness, leading to new experiments revealing yet more data to be incorporated into a theory of mind. The brain does perform computations on sense-data, and complex cortical functions are still better modeled with the computer metaphor.

So what is this computational mind? It is what drives the transformations the representation moves through as it becomes the form that will reach consciousness. In our neurodynamical model, this correlates to the mathematics arising from the specific interconnections in the neuronal ensembles. It is the chaos of the brain state that drives these transformations, and the self-organization of the representation is also a self-computation. So much for the computational mind/brain problem.

And the mind/mind problem? One oft repeated mantra in cognitive neuroscience is, "Mind is something brain does." Subjective experience emerges from the chaotic complexity of the brain. That chaos is a computer, and the phenomenal mind is as emergent as any of the phenomena.

FURTHER READING

Kinsbourne, Marcel. "What Qualifies a Representation for a Role in Consciousness?" In Cohen, Jonathan D. and Jonathan Schooler (eds.), Scientific Approaches to Consciousness. Lawrence Erlbaum: 1997.

Jackendoff, Ray. Consciousness and the Computational Mind. MIT Press, Cambridge: 1987.

Varela, Francisco J., Evan Thompson, and Eleanor Rosch. The Embodied Mind. MIT Press, Cambridge: 1991.

http://www.phil.vt.edu/ASSC/ - Association for the Scientific Study of Consciousness.

http://www.culture.com.au/brain_proj/ - The Brain Project.

