Laboratory attributes
The core expertise of the iBUG group is the machine analysis of human behavior in space and time, spanning face analysis, body gesture analysis, visual, audio, and multimodal analysis of human behavior, and biometrics. The group applies this expertise to face analysis, body gesture analysis, audio-visual human behavior analysis, biometrics and behaviometrics, and human-computer interaction (HCI).
The Intelligent Behaviour Understanding Group engages in several areas of research.
Active areas of research include the detection of facial features (e.g., geometric features such as facial points, transient features such as wrinkles, and dynamic features such as texture changes), the extraction of facial expression information, including facial muscle action detection and the analysis of expression dynamics, and the interpretation of that information (e.g., in terms of emotions, social signals, or person identity).
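As a minimal illustration of a geometric feature computed from detected facial points, the sketch below computes the eye aspect ratio, a standard cue for blinks and expression dynamics, from six eye landmarks. The landmark ordering follows the common 6-point eye annotation scheme; the coordinates themselves are made up for illustration and do not come from any real detector.

```python
import math


def eye_aspect_ratio(pts):
    """Compute the eye aspect ratio (EAR) from six eye landmarks.

    pts: six (x, y) tuples ordered p1..p6, where p1/p4 are the horizontal
    eye corners and p2, p3 / p6, p5 are the upper/lower eyelid points.
    EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); it drops towards 0 as the
    eye closes, so tracking it over time yields a simple dynamic feature.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    p1, p2, p3, p4, p5, p6 = pts
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))


# Synthetic landmarks for an open eye (coordinates are illustrative only).
open_eye = [(0, 0), (2, -2), (4, -2), (6, 0), (4, 2), (2, 2)]
print(round(eye_aspect_ratio(open_eye), 3))  # → 0.667
```

In practice the landmarks would be produced by a facial point detector; the same ratio computed per frame turns static geometry into a dynamic expression cue.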
Another active area is human body action detection in unconstrained settings, including moving cameras and cluttered or dynamic backgrounds, and the application of the resulting methods in domains such as sports therapy, pain therapy, and the detection of social signals.
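To see why unconstrained settings are hard, consider the simplest possible motion cue: per-pixel frame differencing. The toy sketch below (synthetic two-row "frames", not real video) flags pixels whose intensity changes between frames; under a moving camera or a dynamic background nearly every pixel changes, which is precisely why more robust methods are needed.

```python
def motion_mask(prev, curr, thresh=10):
    """Naive frame differencing: mark pixels whose intensity changed by
    more than `thresh` between two grayscale frames (nested lists).
    This cue assumes a static camera and background; it breaks down in
    the unconstrained scenes that action-detection research targets.
    """
    return [[1 if abs(c - p) > thresh else 0 for p, c in zip(pr, cr)]
            for pr, cr in zip(prev, curr)]


# Two tiny synthetic frames; only two pixels actually change.
prev = [[10, 10, 10], [10, 10, 10]]
curr = [[10, 50, 10], [10, 10, 60]]
print(motion_mask(prev, curr))  # → [[0, 1, 0], [0, 0, 1]]
```

With a moving camera, `prev` and `curr` are globally misaligned, so this mask fires everywhere; compensating for that is part of what makes the problem research-grade.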
A further active area is the multimodal analysis and interpretation of naturalistic (as opposed to deliberately displayed) human behavior, taking into account non-linear correlations between multimodal cues (facial expressions, head and body gestures, and various vocalizations) in space and time. The main applications include the detection of nonverbal vocalizations such as laughter, continuous and dimensional analysis of affective and mental states, and the analysis of social signals.
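One elementary way to combine cues from several modalities is score-level fusion. The sketch below is only an illustrative weighted average of per-modality confidences (the modality names, scores, and weights are invented); it deliberately ignores the non-linear, temporal correlations that the research described above models.

```python
def fuse_scores(scores, weights):
    """Weighted score-level fusion of per-modality confidences.

    scores:  dict mapping modality name -> confidence in [0, 1]
    weights: dict mapping modality name -> non-negative weight
    Returns a single fused confidence in [0, 1].
    """
    total = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total


# Illustrative confidences that a laughter episode is occurring.
scores = {"face": 0.8, "audio": 0.9, "body": 0.4}
weights = {"face": 2.0, "audio": 3.0, "body": 1.0}
print(round(fuse_scores(scores, weights), 3))  # → 0.783
```

Linear fusion like this treats modalities independently; capturing how a smile and a breathy vocalization jointly signal laughter requires the non-linear spatio-temporal models the group studies.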
Active areas of research also include face recognition in 2D and 3D, person identification using facial-expression-based behaviometrics, and combinations thereof.
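Recognition systems of this kind typically compare a probe's feature vector against enrolled templates. The sketch below shows the generic matching step, nearest-neighbour identification by cosine similarity, over made-up 3-dimensional embeddings; real face or behaviometric descriptors are much higher-dimensional and come from learned models.

```python
import math


def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)


def identify(probe, gallery):
    """Return the enrolled identity whose template best matches the probe."""
    return max(gallery, key=lambda name: cosine(probe, gallery[name]))


# Made-up embeddings standing in for real face/behaviometric descriptors.
gallery = {"alice": [1.0, 0.1, 0.0], "bob": [0.0, 1.0, 0.2]}
probe = [0.9, 0.2, 0.05]
print(identify(probe, gallery))  # → alice
```

Combining modalities (e.g., a 3D face match with an expression-dynamics behaviometric) amounts to fusing such similarity scores before the final decision.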
A final active area is the design and development of human-centered interfaces that adapt automatically to naturally occurring multimodal human behavior. Such interfaces include affect-sensitive HCI and systems for implicit (behavior-based) tagging of multimedia content.