Reading Pain in a Human Face

By JAN HOFFMAN

How well can computers interact with humans? Certainly computers play a mean game of chess, which requires strategy and logic, and “Jeopardy!,” in which they must process language to understand the clues read by Alex Trebek (and buzz in with the correct question).

But in recent years, scientists have striven for an even more complex goal: programming computers to read human facial expressions.

The practical applications could be profound. Computers could supplement or even replace lie detectors. They could be installed at border crossings and airport security checks. They could serve as diagnostic aids for doctors.

Researchers at the University of California, San Diego, have written software that not only detects whether a person’s face reveals genuine or faked pain, but does so far more accurately than human observers.

While other scientists have already refined a computer’s ability to identify nuances of smiles and grimaces, this may be the first time a computer has triumphed over humans at reading their own species.

“A particular success like this has been elusive,” said Matthew A. Turk, a professor of computer science at the University of California, Santa Barbara. “It’s one of several recent examples of how the field is now producing useful technologies rather than research that only stays in the lab. We’re affecting the real world.”

People generally excel at using nonverbal cues, including facial expressions, to deceive others (hence the poker face). They are good at mimicking pain, instinctively knowing how to contort their features to convey physical discomfort.

And other people, studies show, typically do poorly at detecting those deceptions.

In a new study, in Current Biology, by researchers at San Diego, the University of Toronto and the State University of New York at Buffalo, humans and a computer were shown videos of people in real pain or pretending. The computer differentiated suffering from faking with greater accuracy by tracking subtle muscle movement patterns in the subjects’ faces.

“We have a fair amount of evidence to show that humans are paying attention to the wrong cues,” said Marian S. Bartlett, a research professor at the Institute for Neural Computation at San Diego and the lead author of the study.

For the study, researchers used a standard protocol to produce pain, with individuals plunging an arm in ice water for a minute (the pain is immediate and genuine but neither harmful nor protracted). Researchers also asked the subjects to dip an arm in warm water for a moment and to fake an expression of pain.

Observers watched one-minute silent videos of those faces, trying to identify who was in pain and who was pretending. Only about half the answers were correct, a rate comparable to guessing.

Then researchers provided an hour of training to a new group of observers. They were shown videos, asked to guess who was really in pain, and told immediately whom they had identified correctly. Then the observers were shown more videos and again asked to judge. But the training made little difference: The rate of accuracy scarcely improved, to 55 percent.

Then a computer took on the challenge. Using a program that the San Diego researchers have named CERT, for computer expression recognition toolbox, it measured the presence, absence and frequency of 20 facial muscle movements in each of the 1,800 frames of one-minute videos. The computer assessed the same 50 videos that had been shown to the original, untrained human observers.

The computer learned to identify cues that were so small and swift that they eluded the human eye. Although the same muscles were often engaged by fakers and by those in real pain, the computer could detect the speed, smoothness and duration of the muscle contractions that pointed toward or away from deception. When a person was experiencing real pain, for instance, the length of time the mouth was open varied; when a person faked pain, the mouth opened for regular, consistent intervals. Other telltale muscle movements included the furrowing between the eyebrows, the tightening of the orbital muscles around the eyes, and the deepening of the furrows on either side of the nose.

The computer’s accuracy: about 85 percent.

Jeffrey Cohn, a University of Pittsburgh professor of psychology who also conducts research on computers and facial expressions, said the CERT study addressed “an important problem, medically and socially,” referring to the difficulty of assessing patients who claim to be in pain. But he noted that the study’s observers were university students, not pain specialists.

Dr. Bartlett said she didn’t mean to imply that doctors or nurses do not perceive pain accurately. But “we shouldn’t assume human perception is better than it is,” she said. “There are signals in nonverbal behavior that our perceptual system may not detect or we don’t attend to them.”

Dr. Turk said that among the study’s limitations were that all the faces had the same frontal view and lighting. “No one is wearing sunglasses or hasn’t shaved for five days,” he said.

Dr. Bartlett and Dr. Cohn are working on applying facial expression technology to health care. Dr. Bartlett is working with a San Diego hospital to refine a program that will detect pain intensity in children.

“Kids don’t realize they can ask for pain medication, and the younger ones can’t communicate,” she said. A child could sit in front of a computer camera, she said, referring to a current project, and “the computer could sample the child’s facial expression and get estimates of pain. The prognosis is better for the patient if the pain is managed well and early.”

Dr. Cohn noted that his colleagues have been working with the University of Pittsburgh Medical Center’s psychiatry department, focusing on severe depression. One project is for a computer to identify changing patterns in vocal sounds and facial expressions throughout a patient’s therapy as an objective aid to the therapist.

“We have found that depression in the facial muscles serves the function of keeping others away, of signaling, ‘Leave me alone,’ ” Dr. Cohn said. The tight-lipped smiles of the severely depressed, he said, were tinged with contempt or disgust, keeping others at bay.

“As they become less depressed, their faces show more sadness,” he said. Those expressions reveal that the patient is implicitly asking for solace and help, he added. That is one way the computer can signal to the therapist that the patient is getting better.

Source: Nytimes.com
