Video-realistic image-based eye animation via statistically driven state machines

Research output: Contribution to journal › Article › Research › peer review


Details

Original language: English
Pages (from-to): 1201-1216
Number of pages: 16
Journal: Visual Computer
Volume: 26
Publication status: Published - 17 Nov 2009

Abstract

In this work we present a novel image-based system for creating video-realistic eye animations for arbitrary spoken output. These animations are useful to give a face to multimedia applications such as virtual operators in dialog systems. Our eye animation system consists of two parts, an eye control unit and a rendering engine, which together synthesize eye animations by combining 3D and image-based models. The designed eye control unit is based on eye movement physiology and the statistical analysis of recorded human subjects. As already analyzed in previous publications, eye movements vary while listening and talking. We focus on the latter and are the first to design a new model which fully automatically couples eye blinks and movements with phonetic and prosodic information extracted from spoken language. We extended the already known simple gaze model by refining mutual gaze to better model human eye movements. Furthermore, we improved the eye movement models by considering head tilts, torsion, and eyelid movements. Mainly due to our integrated blink and gaze model and to the control of eye movements based on spoken language, subjective tests indicate that participants are not able to distinguish between real eye motions and our animations, which has not been achieved before.
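The coupling described in the abstract — gaze states and blinks driven by probabilities conditioned on speech events — can be sketched as a small statistically driven state machine. All state names, transition probabilities, and blink rates below are hypothetical placeholders for illustration, not the statistics the authors extracted from their recorded subjects.

```python
import random

# Illustrative sketch of a statistically driven state machine for gaze
# and blink control. The two gaze states and all probabilities are
# hypothetical; the paper derives such parameters from recorded subjects.

STATES = ("mutual_gaze", "gaze_away")
SWITCH_P = {"mutual_gaze": 0.3, "gaze_away": 0.6}   # P(leave current state)
BLINK_P = {"phrase_boundary": 0.5, "other": 0.05}   # blinks coupled to prosody

def synthesize(speech_events, rng=None):
    """Map a sequence of speech events ('phrase_boundary' or 'other')
    to per-event (gaze_state, blink) decisions."""
    rng = rng or random.Random(0)
    state, timeline = "mutual_gaze", []
    for event in speech_events:
        # Statistically driven state transition.
        if rng.random() < SWITCH_P[state]:
            state = "gaze_away" if state == "mutual_gaze" else "mutual_gaze"
        # Blink probability depends on the prosodic event type.
        blink = rng.random() < BLINK_P.get(event, BLINK_P["other"])
        timeline.append((state, blink))
    return timeline
```

A rendering engine would then consume such a `(gaze_state, blink)` timeline, selecting or synthesizing eye-region images for each frame.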

Keywords

    Computer vision, Eye animation, Sample-based image synthesis, Talking-heads


Cite this

Video-realistic image-based eye animation via statistically driven state machines. / Weissenfeld, Axel; Liu, Kang; Ostermann, Jörn.
In: Visual Computer, Vol. 26, 17.11.2009, p. 1201-1216.


Weissenfeld A, Liu K, Ostermann J. Video-realistic image-based eye animation via statistically driven state machines. Visual Computer. 2009 Nov 17;26:1201-1216. doi: 10.1007/s00371-009-0401-x
@article{2bbaae6cb05441f68337a2bc48ef0f06,
title = "Video-realistic image-based eye animation via statistically driven state machines",
abstract = "In this work we elaborate on a novel image-based system for creating video-realistic eye animations to arbitrary spoken output. These animations are useful to give a face to multimedia applications such as virtual operators in dialog systems. Our eye animation system consists of two parts: eye control unit and rendering engine, which synthesizes eye animations by combining 3D and image-based models. The designed eye control unit is based on eye movement physiology and the statistical analysis of recorded human subjects. As already analyzed in previous publications, eye movements vary while listening and talking. We focus on the latter and are the first to design a new model which fully automatically couples eye blinks and movements with phonetic and prosodic information extracted from spoken language. We extended the already known simple gaze model by refining mutual gaze to better model human eye movements. Furthermore, we improved the eye movement models by considering head tilts, torsion, and eyelid movements. Mainly due to our integrated blink and gaze model and to the control of eye movements based on spoken language, subjective tests indicate that participants are not able to distinguish between real eye motions and our animations, which has not been achieved before.",
keywords = "Computer vision, Eye animation, Sample-based image synthesis, Talking-heads",
author = "Axel Weissenfeld and Kang Liu and J{\"o}rn Ostermann",
year = "2009",
month = nov,
day = "17",
doi = "10.1007/s00371-009-0401-x",
language = "English",
volume = "26",
pages = "1201--1216",
journal = "Visual Computer",
issn = "0178-2789",
publisher = "Springer Verlag",

}


TY - JOUR

T1 - Video-realistic image-based eye animation via statistically driven state machines

AU - Weissenfeld, Axel

AU - Liu, Kang

AU - Ostermann, Jörn

PY - 2009/11/17

Y1 - 2009/11/17

AB - In this work we elaborate on a novel image-based system for creating video-realistic eye animations to arbitrary spoken output. These animations are useful to give a face to multimedia applications such as virtual operators in dialog systems. Our eye animation system consists of two parts: eye control unit and rendering engine, which synthesizes eye animations by combining 3D and image-based models. The designed eye control unit is based on eye movement physiology and the statistical analysis of recorded human subjects. As already analyzed in previous publications, eye movements vary while listening and talking. We focus on the latter and are the first to design a new model which fully automatically couples eye blinks and movements with phonetic and prosodic information extracted from spoken language. We extended the already known simple gaze model by refining mutual gaze to better model human eye movements. Furthermore, we improved the eye movement models by considering head tilts, torsion, and eyelid movements. Mainly due to our integrated blink and gaze model and to the control of eye movements based on spoken language, subjective tests indicate that participants are not able to distinguish between real eye motions and our animations, which has not been achieved before.

KW - Computer vision

KW - Eye animation

KW - Sample-based image synthesis

KW - Talking-heads

UR - http://www.scopus.com/inward/record.url?scp=77955768894&partnerID=8YFLogxK

U2 - 10.1007/s00371-009-0401-x

DO - 10.1007/s00371-009-0401-x

M3 - Article

AN - SCOPUS:77955768894

VL - 26

SP - 1201

EP - 1216

JO - Visual Computer

JF - Visual Computer

SN - 0178-2789

ER -
