Simulating Emotion

I've written previously about the role of emotions as an organizing principle for guiding robotic behavior. Dr. Cynthia Breazeal and colleagues at MIT have taken a slightly different approach, using emotion as a way to interface robots with the human world. One perhaps unexpected benefit of this approach is that these expressed emotions allow the robot to elicit desired human behaviors.

Their "expressive anthropomorphic" robot is named Kismet (pictured above) and has been designed to show a continuum of emotional responses to specific stimuli. For example, when it encounters an object that could be seen as threatening, it may first respond with interest, but will soon turn away from the aversive object, as demonstrated by this video. In situations such as social dialogue, Kismet can respond quite fluidly and naturally to proto-linguistic cues.

The implicit rationale of much of this research is Hull's drive reduction theory, which states that imbalances in homeostasis cause states of psychological arousal. These states of arousal (emotions) then compel us to restore our homeostatic balance, thereby reducing our drives. As Dr. Breazeal put it, "Kismet actually evaluates all the incoming stimuli with respect to: Is this beneficial to me or not? Is it going to satiate a drive that I need to satiate or not?" The answers to these questions evoke responses in different emotional systems.
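A toy sketch may make this drive-reduction loop concrete: each drive has a homeostatic setpoint, deviation from the setpoint produces arousal, and a stimulus counts as "beneficial" if it pushes the drive back toward balance. The class and method names below are illustrative inventions, not drawn from Kismet's actual implementation.

```python
# Toy sketch of drive reduction (illustrative names, not Kismet's real code).
# A drive has a homeostatic setpoint; deviation from it produces arousal,
# and stimuli are appraised by whether they restore homeostatic balance.

class Drive:
    def __init__(self, name, level=0.5, setpoint=0.5):
        self.name = name
        self.level = level          # current internal level, 0..1
        self.setpoint = setpoint    # homeostatic target

    def arousal(self):
        """Arousal grows with distance from homeostasis."""
        return abs(self.level - self.setpoint)

    def evaluate(self, stimulus_effect):
        """Is this stimulus beneficial, i.e. would it reduce this drive's arousal?"""
        new_level = min(1.0, max(0.0, self.level + stimulus_effect))
        return abs(new_level - self.setpoint) < self.arousal()

social = Drive("social", level=0.1)   # under-stimulated: seeking interaction
print(social.arousal())               # 0.4 -> far from homeostasis
print(social.evaluate(+0.3))          # True: a social stimulus satiates the drive
print(social.evaluate(-0.2))          # False: withdrawal worsens the imbalance
```

The key design point is that the robot never reasons about "good" or "bad" in the abstract; a stimulus is evaluated purely by its predicted effect on the distance from homeostasis.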

So how close might Kismet's emotions be to "the real thing" that humans experience every day? From the videos linked above: "The emotions that have been programmed into the robot have been modeled - computationally modeled - after what we know about human emotions. So I would say it's more of a simplification and a subset of what our full-fledged human emotions really are ... By following ideas and theories from psychology, from developmental psychology, and from evolution, from all of these studies of natural systems, and putting these theories into the robot, has the advantage of making the robot's behavior familiar because it's essentially lifelike. It parallels that of natural systems."

Biologically-inspired robotic systems often display eerie, atavistic reenactments of the behavior of the natural systems on which they were modeled. Likewise, it is hard to mistake Kismet's neonatal qualities - not just in the size of its eyes, but in its animate qualities as well. These similarities are not merely superficial: in important ways, Kismet is like a child. Many of the traits Kismet displays are based on, or analogous to, those seen in human children.

Roughly speaking, Kismet is based on a six-part architecture: feature extraction, attention, perception, motivation, behavior, and motor systems. In feature extraction, Kismet specifically tracks eyes and distinct variations in vocal affect, just as human infants do. The attentional stage tags regions of particular interest, such as those that are changing rapidly or objects that are brightly colored relative to their background. Attentionally-tagged items and features are then combined in the perceptual system, which binds this information into a percept that may or may not activate a releaser, each of which is tied to particular emotional responses. These interact with the motivation and behavior subsystems, which establish homeostasis among a number of different drives and select the behaviors that can be used to regulate that homeostasis. Finally, a motor system carries out the behaviors with the help of motor skills, facial animation, expressive vocalization, or oculo-motor movements. Much more detailed information on these various subsystems is available here.
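The six-stage flow above can be sketched as a pipeline of simple functions. Every name, threshold, and rule below is an illustrative stand-in; the real subsystems are far richer than these placeholders.

```python
# Illustrative sketch of the six-stage architecture described above.
# All names and thresholds are invented for illustration, not Kismet's.

def feature_extraction(raw_input):
    # e.g. detect motion and salient color, stand-ins for eye/affect tracking
    return {"moving": raw_input.get("motion", 0) > 0.5,
            "bright": raw_input.get("saturation", 0) > 0.7}

def attention(features):
    # tag features of particular interest (rapid change, high color contrast)
    return [name for name, salient in features.items() if salient]

def perception(tagged):
    # bind tagged features into a percept that may activate a releaser
    return "toy-like" if "bright" in tagged else None

def motivation_and_behavior(percept, drives):
    # a percept that could satiate an under-stimulated drive selects a behavior
    if percept and drives.get("stimulation", 0.5) < 0.5:
        return "orient-toward"
    return "idle"

def motor(behavior):
    # carry out the behavior via gaze, face, or voice
    return {"orient-toward": "turn eyes to stimulus",
            "idle": "hold gaze"}[behavior]

raw = {"motion": 0.2, "saturation": 0.9}
percept = perception(attention(feature_extraction(raw)))
action = motor(motivation_and_behavior(percept, {"stimulation": 0.2}))
print(action)  # turn eyes to stimulus
```

Even this toy version shows the essential coupling: perception alone does not pick a behavior; the same percept produces different actions depending on the current state of the drives.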

Emotions are not typically considered mechanistic or functional - we tend to think of them as cognitive "byproducts," an evolutionary inheritance from ancestors capable only of feeling, and not of thinking. On the contrary, emotions may actually be an integral part of cognition. Below is a list of Kismet's emotions, and the functions each is thought to subserve; does this list accord with your subjective experience of these emotions?

  • Anger: to mobilize and sustain activity at high levels; low levels of anger (frustration) are useful when progress toward a goal is slow
  • Disgust: to create distance between one and an aversive stimulus
  • Fear: to motivate avoidance or escape responses
  • Joy: to broaden experience by encouraging social behavior and reward completion of a goal
  • Sorrow: to slow responses in cases of poor performance, so as to encourage reflection and behavior change
  • Surprise: to signal the occurrence of an unpredicted event, so as to improve future attempts at event prediction
  • Interest: to motivate exploration and learning, and reinforce selective attention
  • Boredom: similar to interest, except its purpose is to force an encounter with a new stimulus, which might then elicit interest
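As a rough illustration of the functional reading above, emotion selection can be sketched as an appraisal over a few crude stimulus dimensions. The dimensions, thresholds, and rules here are invented for illustration and are not drawn from Kismet's actual emotion system.

```python
# Toy appraisal: map simple stimulus/goal dimensions (all 0..1) to one of
# the emotion categories listed above. Rules are illustrative only.

def appraise(novelty, threat, goal_progress):
    """Pick an emotion label from crude appraisals of the current situation."""
    if threat > 0.8:
        return "fear"        # motivate escape from the stimulus
    if threat > 0.5:
        return "disgust"     # create distance from an aversive stimulus
    if goal_progress < 0.2:
        # stalled goal: frustration/anger, or sorrow if nothing new to try
        return "sorrow" if novelty < 0.3 else "anger"
    if novelty > 0.7:
        return "surprise"    # signal an unpredicted event
    if novelty > 0.3:
        return "interest"    # motivate exploration and learning
    if goal_progress > 0.8:
        return "joy"         # reward completion of a goal
    return "boredom"         # force a search for new stimuli

print(appraise(novelty=0.9, threat=0.1, goal_progress=0.5))  # surprise
print(appraise(novelty=0.1, threat=0.9, goal_progress=0.5))  # fear
print(appraise(novelty=0.2, threat=0.1, goal_progress=0.9))  # joy
```

Note how each branch is a behavioral policy, not a feeling: the "function" column of the list above becomes, quite literally, the body of the function.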

Of course, one cannot definitively ascribe intrinsic "functions" to emotions, but it is clear that emotional deficits change behavior - emotions must therefore have some behavioral consequence, which we may assume is evolutionarily advantageous. While it may not be possible to describe exactly what these behavioral consequences are in humans, building autonomous robots may let us test hypotheses about the possible "functions" of emotions, in order to answer precisely those questions that are impossible or unethical to test in humans.
