Wednesday, August 10, 2016

Qualia, or why do we experience anything at all?

Today I'd like to discuss a topic which has baffled (at least some) philosophers for quite some time and which I am even more ill-qualified to address than usual.  Since I'm giving general impressions from general ignorance I'll be citing a few well-known examples without attribution.  You can find a good summary here, or at least it seemed like a good one to me.  Rest assured I'm not claiming to be doing any original work here, just ... conjecturing.

The term qualia has come to encompass experiences, and in particular subjective experiences.  For example, what is it like to see the color red, or what is it like to be a bat.  Such experiences seem to be subjective, in that the experience depends, at least in principle, on who's experiencing it.  To take a very old example, cliche but no less valid for being cliche, I have no obvious way of knowing whether you experience the color red in the way I do.  Perhaps you experience it the way I experience the color blue, and vice versa, or perhaps you experience it some completely different way.

For that matter, how do I know that you experience anything?  If you and I are at an intersection, stopped at a red light, I can see you react to the light turning green, but that doesn't mean that you had the same experience I did of seeing a red light and then a green light.  I assume that you experienced the sensation of something red and then something green, and that the color red seemed essentially the same to you as it did to me, but how would I know?

Suppose you were actually in a self-driving car browsing the news on your phone.  You didn't see the light at all.  Rather, the car's cameras recorded the light changing and the car's control system caused the car to go when the light turned green.  I'm perfectly comfortable saying "The car saw the light change and drove through the intersection when it turned green", anthropomorphizing the car, but that doesn't mean I think the car experienced the colors red and green in anything like the way you or I would (or at least, I think you would).
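
To make the contrast a bit more concrete, here is a toy sketch (in Python, and purely hypothetical -- no real self-driving stack looks like this) of the kind of control logic I have in mind.  Nothing in it "experiences" red or green; camera pixel values get mapped straight to an action:

    class Car:
        def accelerate(self):
            print("proceeding through the intersection")

        def hold_brakes(self):
            print("waiting at the light")

    def classify_light(pixel):
        # Crude classification: whichever of the red/green channels dominates.
        r, g, b = pixel
        return "green" if g > r else "red"

    def control_step(car, camera_pixel):
        # Map the classified color directly to an action -- no experience required.
        if classify_light(camera_pixel) == "green":
            car.accelerate()
        else:
            car.hold_brakes()

    car = Car()
    control_step(car, (210, 40, 30))   # red light: the car waits
    control_step(car, (30, 200, 40))   # green light: the car goes

That, roughly, is what I mean by anthropomorphizing: "the car saw the light" is just a convenient description of a mapping from inputs to outputs.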

Trying to account for distinctions like this in some objective way has been referred to as "the hard problem of consciousness", as opposed to easier, more empirical problems like "How does the brain record memories?" or "To what extent are we conscious of our own decisions?"

In some sense it's quite likely that all experiences are distinct.  If I see a red paint chip today and then again tomorrow, I will almost certainly have different associations each time.  The first time might put me in mind of a stop sign, or blood, or a red apple.  The second time I might be more focused on whether it's the same paint chip I saw yesterday.  Likewise, you will almost certainly have different associations than I will even if we're looking at the same chip.

And yet, we would probably all agree that we are experiencing seeing something red, and that it feels like something to have that experience.  Even if there's no emotional response, we're still having some sort of experience.  How do we account for that?

Suppose we could account for every firing of every neuron in the nervous system (including the optic nerve, which is actually doing quite a bit of processing before the signal even gets to the brain).  Have we accounted for the experience?  Suppose that after decades of research we compile an exhaustive list of experiences and how they correlate to brain activity.  We bring in a new subject and scan their neural activity.  Pointing at a display, we say "That pattern of firing always occurs in response to seeing the color red".  We can say "that person is experiencing the color red", but how, exactly, do we know that for sure?

It's not hard to imagine what kind of data would back this up.  We hook hundreds of subjects from all over the world and all walks of life up to our highly-advanced brain scanner, flash colors at them and note the results.  We may even ask them to describe what they're experiencing.  When we see the same patterns for our new subject it's a reasonable inference that their brain is processing the color red, and it's reasonable to expect that if we ask them what they're experiencing, their answer will involve the color red.
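
The inference here is essentially pattern matching.  As a toy sketch (again hypothetical, and assuming we could somehow boil a scan down to a vector of numbers), matching a new subject's scan against the reference data might look something like this:

    import math

    # (scan features, color the subject reported) from the reference study
    reference_scans = [
        ((0.9, 0.1, 0.2), "red"),
        ((0.8, 0.2, 0.1), "red"),
        ((0.1, 0.9, 0.3), "green"),
        ((0.2, 0.8, 0.2), "green"),
    ]

    def infer_color(scan):
        # Nearest-neighbor match: find the closest known pattern and report its color.
        def distance(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        _, color = min(reference_scans, key=lambda rec: distance(rec[0], scan))
        return color

    print(infer_color((0.85, 0.15, 0.2)))   # prints "red"

All this tells us is that the new scan looks like the scans of people who said "red", which is exactly the gap the philosopher is about to point out.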

That's probably good enough for a cognitive scientist, but not a philosopher.  The philosopher may well insist that we don't know what the subject experienced, but only how they would answer a question.  They -- and for that matter any of our other subjects -- might just as well be philosophical zombies who exhibit all the expected behaviors and responses without actually experiencing anything.  We may know intuitively, but we can't prove that the test subjects aren't just like the self-driving car, only on a more elaborate level.


There are a couple of ways out of this.  One is to deny that qualia exist in any well-defined way.  From a logical point of view, this seems quite plausible.  We can talk about the abstract concept of redness, but in real life we don't experience redness in the abstract.  We experience a particular something red at a particular place and time.  That feels a particular way at that place and time, and quite possibly nothing has ever felt quite the same before or ever will.  Maybe we should just stick to our knitting and figure out what happens in real brains in response to real stimuli.  We can still generalize and define abstractions, but if we want an objective description of the world we have to start with objective data.

And yet, we still experience things, subjectively, each of us (or at least I'm pretty sure about me).

So how do we distinguish between a person at a stop light and a self-driving car?  Maybe we don't need to make a strong distinction.  Maybe we're ... not so different.

There's no particular reason, beyond our innate sense of specialness, to assume that only human beings can have experiences.  If we see a hungry dog, our intuition tells us the dog is experiencing hunger.  Our intuition is probably right.  The dog may not be having exactly the same kind of experience we do, but there's no reason to assume it's a philosophical zombie that only looks like it's experiencing hunger.

One way of handling this is to assert that along with the physical properties of the world -- mass, position, velocity and so forth -- there is an experiential component that's completely distinct but which we might still be able to reason about.  Perhaps we will even discover laws that govern it and develop a comprehensive theory of experience.

One objection to this approach is that it seems to imply panpsychism, the idea that everything has consciousness.  There are already schools of thought that believe exactly that, but the concept doesn't sit particularly well in materialist circles (materialist in the philosophical sense).

However, this objection seems misguided.  If consciousness in the sense of being able to experience qualia is a property in a way similar to mass being a property of things, that doesn't mean that everything has to have that property.  Just as photons are massless, there's no contradiction in saying a rock is unconscious.

Rather than stating that everything has consciousness, we are asserting that objects can have consciousness, and we are trying to investigate under what circumstances that happens.  However, we are explicitly punting on the question of how it has consciousness.  We are saying that when the conditions are right "it just does", just as when a particle interacts with the Higgs field it has mass* (I believe physics has a more detailed account of this than "it just does", but at some point even physics has to make some base assumptions).

From that point of view it's still reasonable to say that a rock has no feelings or consciousness, but a human does, a dog does and just possibly a self-driving car has some limited degree of consciousness as well.  Moreover, we may be able to prove that in the scientific sense of having a coherent theory and data to support it.  If so, it seems this theory will look a lot like a purely material explanation of memory, attention and other aspects of consciousness, together with an assertion that when certain of these are present, the thing in which they are present experiences qualia.

What is it to be a self-driving car?  Probably not much, but perhaps something.

* [That's not a really rigorous way to phrase that, but I don't know the physics well enough to give a better one --D.H.]