Academic Tutoring: Consciousness & Philosophy

Posted by Enoch Lambert on 7/8/13 9:44 AM

After being ignored or swept under the rug by scientists and philosophers alike for decades, consciousness has come into central focus over the last 20 years. Biologists, neuroscientists, psychologists, and philosophers have all weighed in with books and reviews proffering, or disputing, new theories that attempt to explain consciousness. 


But what is this phenomenon, consciousness, that everyone is seeking to explain?  Although people often invoke the term and make assertions with it as though it were perfectly clear what is meant, it turns out that there are several different notions of consciousness--some more philosophically problematic than others.

One of the most basic and widespread phenomena to which the term is applied is simply wakefulness in contrast to sleep.  Thus, there is a sense of consciousness for which any being that can be said to be awake can thereby be said to be conscious. Among the wakeful, though, there are further distinctions to be made.

Besides ascribing consciousness to organisms as a whole, it is commonplace to distinguish between particular mental states that are conscious or not.  While the distinction goes back, in some form, at least to the great philosopher Leibniz, it is now central to scientific psychology.  Popularly, the distinction between the conscious and subconscious is associated with Freud, who gave us a picture of the subconscious as teeming with drives and desires that motivate us in ways we are largely unaware of, or that cause us to behave in ways for which we subsequently construct post-hoc rationalizations. 

Sublimated libidos aside, there is a sense in which psychology today holds that most of our mental states are subconscious.  No being with conscious visual imagery, for instance, is thought to be conscious of the many computations that go into turning triggered receptors into perception of depth, color, and motion.  Nor are humans thought to be conscious of the grammatical rules operative in first-language acquisition.  And there is a forty-year-old research program in psychology that has been highly successful in uncovering heuristics and rules of thumb in everyday reasoning that people rely on without being aware that they do so. 

So what distinguishes the conscious mental states from the subconscious ones? 

One feels intuitively compelled toward something like the following thought: “well, at the very least, the conscious ones are the ones that I am aware of”.  But this cannot be complete as it stands, for it appeals to consciousness (in the form of “awareness”) to tell us which mental states are states of consciousness!  Does this mean that consciousness is not analyzable or definable in simpler terms?  Not necessarily. In fact, there is a clever theory, the “higher-order thought” theory of consciousness, which attempts to capture what is right about that intuitive thought without the circularity.  In basic outline, what distinguishes an unconscious mental state from a conscious one is a further mental state, itself possibly subconscious: the thought that one is now undergoing the first mental state.  Let me elaborate.

Mental states are often said to have contents.  For example, it is contents that distinguish the many different beliefs that a person has.  My belief that Obama is currently President of the U.S. and my belief that I live in Massachusetts differ in what each is about, that is, their contents.  Notice that neither of the beliefs just mentioned has contents that are themselves about beliefs.  But beliefs could certainly have contents that are about beliefs or other mental states. 

For instance, someone else might believe of me, Enoch Lambert (a philosophy tutor and academic tutor), that I believe Obama is President. Now, according to the higher-order thought theory of consciousness, if I believed of myself that I am currently having the thought that Obama is President, that belief would become conscious for me. 

Call mental states that are not about mental states first-order. Call all others (i.e., all mental states that are about other mental states) higher-order.  The higher-order theory of consciousness says that higher-order thoughts can make first-order mental states conscious without themselves being conscious.  What is left to explain for the higher-order thought theorist, then, is not consciousness as such. 
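For readers who think more easily in concrete terms, the theory’s core claim can be sketched as a toy model.  This is purely an illustration of the structure just described, not anything the theory’s proponents themselves offer; all names here (`MentalState`, `is_conscious`, and so on) are invented for the sketch:

```python
# Toy sketch of the higher-order thought (HOT) theory's core claim.
# All names are hypothetical, chosen only for this illustration.

from dataclasses import dataclass
from typing import Optional, List


@dataclass(frozen=True, eq=False)
class MentalState:
    """A mental state individuated by its content.

    `about` is None for first-order states (states about the world),
    or another MentalState for higher-order states (states about states).
    """
    content: str
    about: Optional["MentalState"] = None


def is_conscious(state: MentalState, all_states: List[MentalState]) -> bool:
    """Per the HOT theory: a state is conscious just in case some further
    state (which may itself be unconscious) is about it."""
    return any(other.about is state for other in all_states)


# First-order belief: about the world, not about any mental state.
belief = MentalState("Obama is President")

# Higher-order thought: a state whose content is the belief itself.
hot = MentalState("I am now thinking that Obama is President", about=belief)

states = [belief, hot]
print(is_conscious(belief, states))  # True: the HOT is about the belief
print(is_conscious(hot, states))     # False: no state is about the HOT
```

Notice that the higher-order thought itself comes out unconscious here, which is exactly the theory’s point: consciousness of a first-order state requires a further thought about it, but that further thought need not itself be conscious.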

The problem of consciousness for them is reduced to more general problems:

1) how can mental states have contents at all?

2) how can mental states have contents that are themselves about mental states? 

The problem of mental content is one that every theory of mind must confront, and there are numerous accounts on offer that this post will not take up.  

One worry about the higher-order thought theory of consciousness is that it seems to be committed to denying consciousness to beings we think have consciousness. 

Anyone who grew up with a pet rodent--a gerbil, guinea pig, or rat--can hardly deny that such animals sometimes experience pain.  Their squeals, avoidance behavior, and looks of terror are sufficient grounds for ascribing pain to them. And pain is a conscious mental state if anything is.  But it is highly unlikely that rodents have the conceptual sophistication necessary to have thoughts about their own mental states.  If they do not, then higher-order theories of consciousness must deny either that such rodents have pain or that their pains are conscious.  If you think either denial is highly implausible, then you shouldn’t accept the higher-order thought theory--at least not as a complete theory of consciousness.

Well what is it about pain, exactly, that presents a problem for higher-order theories of consciousness? 

According to many philosophers it is the “phenomenal” or “qualitative” character of pain.  Think about what it is like to feel the sensation of a sharp, stinging pain (perhaps the result of a pinch or bee sting). Now think about what it is like to feel the sensation of a dull yet persistent headache.  Now think about what it is like to experience a rush of euphoria (perhaps while listening to your favorite band perform one of your favorite songs).  The differences are said to be qualitative differences in conscious phenomenology.  And it seems to many to be wildly implausible that such differences could be created out of whole cloth by thought alone, any more than thought alone can make it true that Obama is not currently President.  Yet this is what it seems any higher-order thought theory of consciousness must be committed to.

The problems raised by phenomenal consciousness are widely thought to generalize far beyond higher-order thought theories, however.  For while neuroscience advances in localizing the brain areas with which different kinds of phenomenal experience are correlated, one wonders why such areas should give rise to one phenomenal experience rather than another, or to any phenomenal experience at all.  Remember, the cells that we find in brains (e.g., neurons) are in many respects very similar to cells everywhere else in the body.  And no one has any reason to believe that such differences as there are could suddenly bestow the capacity for qualitative phenomenology where there was none previously. Nor does anyone have any idea how those differences could be explanatorily mapped onto differences in phenomenology like those mentioned above. 

It might be argued that the brain, as a whole organ, is sufficiently different from any other organ in the body that it gives us a foothold for thinking about how it might produce something as different as qualitative phenomenology.  This is too quick, though.  We do have reasons to think that the brain is specialized to do certain things unlike any other organ: process large amounts of sensory information, make “decisions”, and issue motor instructions.  But we also know that massive amounts of similar information processing are possible in what we take to be the complete absence of phenomenal consciousness: witness modern-day computers.  We have no special reason, then, to predict that the physiological specialties of brains would give rise to qualitative phenomenological states.  In fact, we have no special reason to predict that any sort of physical stuff we currently know about should give rise to phenomenal consciousness rather than not (or rather than some other sort of physical stuff).  The problem this raises has been dubbed by one of the world’s leading philosophers, David Chalmers, the hard problem of consciousness.  How on earth could any brute material stuff give rise to something like phenomenal consciousness?  In contrast, the “easy” problems of consciousness include questions like: what parts or features of brains are correlated with our qualitative experiences of boredom? With internal monologues? Grief? Awareness of bodily position?  Etc.

The “hard” problem of consciousness is the focus of intense philosophical research and debate today.  If you are interested in it, or in consciousness in general, I recommend the entry on consciousness in the Stanford Encyclopedia of Philosophy, as well as getting acquainted with the book that really made the “hard” problem what it is today--David Chalmers’ The Conscious Mind, a contemporary philosophical classic that has had a huge impact both in philosophy and in the sciences of the mind.   

Tags: philosophy