Sunday, August 21, 2011

And then it became self-aware

Something in our mind likes magic thresholds -- crisp, clear dividing lines, to one side of which is X and to the other not-X.  The world has other notions.  Accepting this takes continual effort.

When I was first learning how logic gates worked, my mathematical mind was enchanted by the clean symbolism of boolean logic, its Ands, Ors and Nots dancing their beautifully symmetrical algebraic ballet, its truth tables laying out precisely how the various operators combined True and False.
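For anyone who'd like to see those truth tables spelled out, here's a minimal sketch in Python -- my own choice of notation here, nothing canonical about it; any language would do -- enumerating how And, Or and Not combine True and False:

```python
import itertools

# Enumerate every combination of two inputs and show how the basic
# boolean operators combine True and False.
for a, b in itertools.product([False, True], repeat=2):
    print(f"{a!s:>5} AND {b!s:>5} = {a and b}")
    print(f"{a!s:>5} OR  {b!s:>5} = {a or b}")

# Not takes a single input.
for a in [False, True]:
    print(f"  NOT {a!s:>5} = {not a}")
```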

I would spend hours poring over component catalogs, drawing circuit diagrams full of gates and lines and little circles representing Not.  I had some notion of how those gates broke down into individual transistors, a transistor being an idealized beast that modulated perfect high and low voltages with other perfect high and low voltages.

And then I started looking at the technical specs more closely.  With growing discomfort I came to realize that there simply is no perfect step function from low to high.  The transition in the middle might be more or less exponential, but it is not perfectly vertical.  As I struggled to understand flip-flops and latches, I puzzled over metastable states and propagation delays.  Those weren't on the pretty circuit diagrams, were they?  There came a time when my eye could no longer filter out the symbols for resistors and capacitors sprinkled among the transistors -- and then there were those stowaway analog components like operational amplifiers skulking around, daring to use the same transistors as the digital circuits.  What had happened to my digital world?  When you got down to it, it was all analog at heart.
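To make that concrete, here's a toy comparison -- an idealized logistic curve standing in for a real device's transfer characteristic, not actual transistor physics -- of the textbook step function against the kind of smooth transition a real circuit produces:

```python
import math

def ideal_step(v_in, threshold=2.5):
    """The textbook fantasy: output snaps instantly from low to high."""
    return 5.0 if v_in >= threshold else 0.0

def smooth_transition(v_in, threshold=2.5, steepness=4.0):
    """A toy stand-in for a real device: a logistic curve that is
    steep near the threshold but never perfectly vertical."""
    return 5.0 / (1.0 + math.exp(-steepness * (v_in - threshold)))

for v in [0.0, 1.0, 2.0, 2.4, 2.5, 2.6, 3.0, 4.0, 5.0]:
    print(f"in={v:4.1f}V  ideal={ideal_step(v):4.1f}V  "
          f"smooth={smooth_transition(v):5.2f}V")
```

The numbers near the threshold are the point: the "smooth" column never jumps cleanly from 0 to 5, it just changes quickly.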

I could cite several other cases of learning that simple on/off distinctions generally don't hold up to close scrutiny, but one more will suffice.  From time to time, sometimes in classrooms but usually not, I would try to learn to draw, something I'm still not at all good at.  Along the way, studying shading, I learned the old saw that there are no lines in nature.  Where one might draw a line in a sketch or cartoon, there was actually a sharp, but not perfectly sharp, change in shading.  It was the eye that inferred a line, the same eye that could therefore accept a line drawing as realistic even when, objectively, it was anything but.


Understanding of intelligence, whether natural or artificial, can suffer from the same tendency to create lines where none exist.  It's tempting to try to come up with a clear, crisp definition of intelligence, but intelligence is not a binary attribute.  There are many different ways to be intelligent, some of which can manifest to significantly varying degrees.  Cognitive science has identified scores of intelligent behaviors, from counting to recognizing faces to remembering a path and far beyond.

Most notions of intelligence require the ability to learn, but what's learning?  The best answer I know is that there are many kinds of learning, just as there are many aspects to intelligence -- and there is quite likely no simple relationship between the two.

Which brings me to the title.  A recurring motif in science fiction and its cousins is the notion of a machine becoming self-aware and therefore, by a commonly accepted notion of intelligence, intelligent.  This magical moment brings us spine-tinglingly near the very engines of creation, to say nothing of providing an infinitely more formidable opponent for Our Hero.  That's fine for plot purposes, but just as there are many kinds of learning and intelligence, there must be many sorts of awareness, self- or otherwise.

For example, many things with eyes react to other things with eyes watching them, in some cases even turning it to their advantage.  Without trying to put together a nice crisp definition of awareness -- after all, my whole point here is that such definitions never stand up to a good round of "But what about ...?" -- I will posit that a bird watching you watch it is in some sense aware of you.

Statements like that can cause a certain discomfort among human readers because we all agree, quite possibly correctly, that a bird is not aware of the world in the same way we are.  If awareness is a binary attribute then, perforce, birds must not have it, because we do have awareness and birds don't have the same awareness we have.  QED.  Unfortunately, as airtight as that logic may be, it doesn't really tell us much.  We already knew birds weren't humans.

If, however, we allow that there may be many kinds of awareness, we can make fairly concrete assertions, in fact more detailed and meaningfully testable assertions, without getting backed into logical corners.  For example, if we assert that there is such a thing as watching -- actively behaving so as to keep something in sight, say -- and there is also awareness of being watched -- leaving aside what exactly that might comprise -- we can assert that both we and birds have those capabilities without saying that we apprehend the world the same way birds do.

There are many sorts of awareness that we share with birds and many other kinds of animal.  For example, many animals can recognize individuals, reacting differently depending on whether the other party is a stranger or familiar.  Both we and birds can be aware of where things are hidden, and in fact some species of bird appear to be much better at that than we are.  Both we and they can find our way from point A to point B and back and remember new routes that we find.

This is leaving aside a host of simple capabilities that seem too trivial to note until one realizes that not every living thing has them:  For example, knowing that some things are safe to eat and some aren't, that some animals are liable to attack you and some aren't, that there are objects in the world and we can manipulate them, that things dropped tend to fall, and so forth.

So how do we differ from birds in awareness?  For one thing, birds probably have some sorts of awareness that we lack.  Migratory birds appear to be aware of the strength and orientation of the Earth's magnetic field, and flying birds in general must surely have a richer awareness of three-dimensional space than we do.

Likewise, of course, we must surely be aware of things that birds aren't, but once we get done congratulating ourselves on being such vastly more sophisticated creatures, what would those things be?

A bird may be aware of the local magnetic field, but I'll boldly assert here that it isn't aware that said field is caused by electric currents in the Earth's outer core.  Fine, but just what is it here that we have that they lack that allows us to be aware of such things?  If you want to say "abstract concepts", bear in mind that at least some birds can count and appear to distinguish "same" from "different".  Also bear in mind that not every human is aware of such things (I had to look up the part about it being the outer core), so we're probably grasping at some sort of abstract awareness of cause and effect.  I'm not denying that there's something there, but we do have to be careful trying to define what it is.  Just saying "it's abstract" doesn't really help.


Here's a stab at something more like what our hypothetical AI villain would have to be able to grasp in order to become the dangerously-aware creature we'll pay ten bucks to see:
Last week, John met Martha at a party on a boat on Lake Michigan.  It turned out that they had grown up within a mile of each other, but never known it.
From that short paragraph, you now know not only where John and Martha met, and when, and that they grew up close together without knowing of each other, but also that I know that John and Martha know that fact (though they didn't until last week), and I know that you now know that, and ... well, you get the drift.  This is the sort of awareness that seems, if not completely unique to humans, rare in the animal world.  It's the sort of awareness that can make one a cunning adversary.  If you don't know that I know you're sneaking up on me, I may well have a crucial tactical advantage.
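If it helps to see that nesting made mechanical, here's a toy sketch -- a made-up Knows structure of my own, not any standard formalism from epistemic logic -- that stacks up who-knows-what claims to arbitrary depth:

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class Knows:
    """A claim that some agent knows a fact, where the fact may itself
    be another Knows claim -- nesting to any depth."""
    agent: str
    fact: Union[str, "Knows"]

    def describe(self) -> str:
        inner = self.fact if isinstance(self.fact, str) else self.fact.describe()
        return f"{self.agent} knows that {inner}"

base = "John and Martha grew up within a mile of each other"
claim = Knows("the reader", Knows("the author", Knows("John", base)))
print(claim.describe())
# the reader knows that the author knows that John knows that
# John and Martha grew up within a mile of each other
```

Each extra level is trivial to build in code, but each one corresponds to a real inferential step -- the kind of step that seems to come easily to us and rarely, if ever, to other animals.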

But is it self-awareness?  There is a famous experiment in which an animal is given access to a mirror.  Initially, all animals tend to react to the animal in the mirror as a different animal -- this includes humans who haven't seen a mirror before.  Some animals, however, will eventually start to behave differently, for example by poking at a spot painted on their forehead or positioning the mirror or themselves in order to see places they can't ordinarily see.

Animals that can do so include humans, bonobos, chimpanzees and orangutans, but also bottlenose dolphins, orcas and European magpies.  On the other hand most animals, including ones much more closely related to these animals than they are to each other, don't seem to be able to make the same leap.  Nor, for that matter, can humans less than about eighteen months old.

We may as well call mirror-test awareness self-awareness, but clearly passing the mirror test doesn't necessarily mean being able to make the kind of I-know-you-know inference described above.  It's also at least logically possible to reason in sophisticated ways about who knows what without being able to pass the mirror test.  In short, just as there are many kinds of awareness, there are most likely many kinds of self-awareness.

What we're really looking for here goes by the name "Theory of Mind", which is a good topic for another post ...