National Science Foundation: Information Technology Research Grant #0085980
Multimodal Computer-aided Learning System
The goal of this project is to contribute to the development of a
human-computer interaction environment in which the computer detects and
tracks the user's emotional, motivational, cognitive and task states, and
initiates communications based on this knowledge, rather than simply
responding to user commands. The research issues include:
- User state detection:
Development and testing of methodologies and algorithms to obtain and
fuse information from multimodal sensors to deduce the human user's
cognitive, emotional, and motivational states.
- User state tracking:
Development of a system for tracking the user's states dynamically in
real time, and using this information to infer user strategies.
- User state theory and proactive communication:
Development of a theoretical framework of user states and communication
effects that can guide proactive and responsive computer action decisions.
- Affective communication:
Development of effective animated visual/audio avatars that can exhibit
facial expressions and produce affective synthesized speech.
- Computer learning:
Implementation of computer learning and reasoning capabilities that, using
user state feedback, improve the usefulness of the system.
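As an illustration of the user-state detection component above, the sketch below fuses evidence from two modalities into a belief over discrete user states using naive-Bayes fusion. The state names, modalities, and probability values are purely hypothetical placeholders, not details from the proposal; this is one minimal way such multimodal fusion could be realized.

```python
# Hypothetical sketch of multimodal user-state fusion. State labels and
# likelihood values are illustrative assumptions, not from the proposal.

STATES = ["engaged", "frustrated", "bored"]

def normalize(belief):
    """Rescale a belief so its probabilities sum to one."""
    total = sum(belief.values())
    return {s: v / total for s, v in belief.items()}

def fuse(prior, *likelihoods):
    """Naive-Bayes fusion: multiply the prior belief by each modality's
    likelihood for every candidate state, then renormalize."""
    belief = dict(prior)
    for lik in likelihoods:
        belief = {s: belief[s] * lik[s] for s in STATES}
    return normalize(belief)

# Uniform prior; one likelihood from a (hypothetical) facial-expression
# channel, one from a (hypothetical) speech-prosody channel.
prior = {s: 1.0 / len(STATES) for s in STATES}
face = {"engaged": 0.2, "frustrated": 0.7, "bored": 0.1}
voice = {"engaged": 0.3, "frustrated": 0.6, "bored": 0.1}

belief = fuse(prior, face, voice)
print(max(belief, key=belief.get))  # prints "frustrated"
```

Re-running the fusion with each new observation, using the previous posterior as the prior, gives a crude form of the dynamic state tracking described above (a real system would also model state transitions over time).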
The ideas and tools resulting from the research will be evaluated in an
educational environment. This testbed concerns the learning of math and
science by upper elementary and middle school children, with
over-representation of females and children from minority groups. Of
course, the results will be widely applicable to many domains, not
limited to this testbed.
The proposed research is high-risk and ambitious. However, we firmly
believe that partial results from the various components of the research
can be integrated to create a meaningful Proactive Computer environment,
and that this will allow the study of the effects of this type of
environment to begin.