In this paper, we examine the problem of using video analysis to assess pain, a problem of particular importance for critically ill, non-communicative patients and people with dementia. We propose and evaluate an automated method to
detect the presence of pain manifested in patient videos using a unique and large collection of cancer patient videos
captured in patient homes. The method is based on detecting pain-related facial action units defined in the Facial Action Coding System (FACS), which is widely used for objective pain assessment. In our research, a person-specific
Active Appearance Model (AAM) based on the Project-Out Inverse Compositional method is trained for each patient
individually for the modeling purpose. A flexible representation of the shape model is used in a rule-based method that is
better suited than the more commonly used classifier-based methods for application to the cancer patient videos in
which pain-related facial actions occur infrequently and more subtly. The rule-based method relies on feature points
that provide facial action cues and are extracted from the shape vertices of the AAM, which correspond naturally to
facial muscle movements. In this paper, we investigate the detection of a commonly used set of pain-related action units
in both the upper and lower face. Our detection results show good agreement with the results obtained by three trained
FACS coders who independently reviewed and scored the action units in the cancer patient videos.
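To make the rule-based idea concrete, the following is a minimal, hypothetical sketch of detecting one action unit from tracked landmark geometry. The function name, landmark choices, and threshold are illustrative assumptions, not the paper's actual rules: it flags AU4 (brow lowerer) when the brow-to-eye distance, normalized by interocular distance, drops sufficiently below its neutral-frame value.

```python
def euclidean(p, q):
    """Distance between two 2D landmark points (x, y)."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def detect_au4(brow_pt, eye_pt, interocular, neutral_ratio, threshold=0.15):
    """Hypothetical rule for AU4 (brow lowerer).

    brow_pt, eye_pt: tracked landmark coordinates in the current frame
    interocular: distance between eye centers, used for scale normalization
    neutral_ratio: brow-to-eye distance ratio measured on a neutral frame
    threshold: illustrative relative-drop cutoff, not a value from the paper
    """
    ratio = euclidean(brow_pt, eye_pt) / interocular
    # Fire the rule when the normalized brow height drops enough
    # relative to the neutral frame.
    return (neutral_ratio - ratio) / neutral_ratio > threshold

# Neutral ratio 0.40; current frame: brow 0.30 above the eye, interocular 1.0,
# giving a 25% relative drop, which exceeds the 15% cutoff.
print(detect_au4((0.0, 0.30), (0.0, 0.0), 1.0, 0.40))  # True
```

A real system would apply one such rule per action unit, with landmark indices taken from the AAM shape vertices and thresholds tuned against coder-labeled frames.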