Faculty Funded Work

Multimodal Detection of Affect Using Mobile Devices

Dr. Cynthia Howard


Current methods for researching and modeling affect require peripheral devices, many of which are uncomfortable or disconcerting for the user. To identify predictors of affective states more realistically, we are developing an application that uses the sensors built into mobile devices, such as cameras, pressure sensors, accelerometers, and gyroscopes, to collect data while users interact with a task designed to induce frustration. The collected data will then be used to derive a predictive model of frustration. Such a model could be incorporated into applications such as intelligent tutoring systems, where frustration has been shown to impede learning. Knowledge of the user's affective state could then be used to trigger actions that alleviate frustration and enhance learning.
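To illustrate the kind of pipeline the abstract describes, the sketch below shows one plausible ingredient: summarizing a window of 3-axis accelerometer samples into features and applying a simple rule in place of a learned model. This is a minimal illustration, not the project's actual method; the function names, the variability threshold, and the simulated "calm" vs. "agitated" data are all hypothetical assumptions, and a real system would replace the rule with a model trained on the collected multimodal data.

```python
import numpy as np

def extract_features(window):
    """Summarize one window of 3-axis accelerometer samples (N x 3).

    Returns [mean magnitude, magnitude std, mean absolute jerk],
    a hypothetical feature set for movement variability.
    """
    mag = np.linalg.norm(window, axis=1)  # per-sample acceleration magnitude
    return np.array([mag.mean(), mag.std(), np.abs(np.diff(mag)).mean()])

def predict_frustrated(features, threshold=0.2):
    """Stand-in for a predictive model: flag high movement variability.

    A trained classifier over all sensor modalities would go here.
    """
    return bool(features[1] > threshold)

# Hypothetical: simulate low-jitter ("calm") and high-jitter ("agitated")
# interaction windows instead of reading real device sensors.
rng = np.random.default_rng(0)
calm = rng.normal(0.0, 0.05, size=(50, 3))
agitated = rng.normal(0.0, 0.5, size=(50, 3))

f_calm = extract_features(calm)
f_agitated = extract_features(agitated)

print(predict_frustrated(f_calm))
print(predict_frustrated(f_agitated))
```

In a deployed application, the windows would come from the device's sensor APIs rather than synthetic data, and features from multiple modalities (touch pressure, camera-based cues, gyroscope readings) would feed a trained classifier rather than a fixed threshold.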