Loughborough University
Leicestershire, UK
LE11 3TU
+44 (0)1509 263171

Loughborough University Institutional Repository

Please use this identifier to cite or link to this item: https://dspace.lboro.ac.uk/2134/10174

Title: Face pose estimation from eyes and mouth
Authors: Shafi, Muhammad
Chung, Paul Wai Hing
Keywords: Human-computer interaction
Mouth map
Eye map
Pose estimation
Issue Date: 2010
Publisher: © Inderscience Enterprises Ltd.
Citation: SHAFI, M. and CHUNG, P.W.H., 2010. Face pose estimation from eyes and mouth. International Journal of Advanced Mechatronic Systems, 2 (1/2), pp. 132 - 138.
Abstract: Face pose estimation plays an important role in human-computer interaction, automatic human behaviour analysis, gaze estimation, virtual reality, pose-independent face recognition, and related applications. Accuracy and speed are the most desirable features of a face pose estimation system. In this paper, a face pose estimation scheme based on the centres of the eyes and mouth is proposed. The proposed method is computationally very efficient because it uses only three points, i.e., the eye and mouth centres. The use of only three points increases the pose estimation range and makes the method suitable for real-time applications. Tests using the Pointing '04 database show that the proposed scheme is robust and fast.
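The abstract's core idea, that three facial landmarks (two eye centres and the mouth centre) constrain head pose, can be illustrated with a minimal sketch. This is not the authors' actual scheme; the function below is a hypothetical illustration that recovers roll from the eye line and a crude yaw cue from the mouth's horizontal offset relative to the eye midpoint:

```python
import math

def estimate_pose(left_eye, right_eye, mouth):
    """Rough head-pose cues from three 2D points in pixel coordinates.

    Illustrative only, not the method of Shafi and Chung (2010):
    roll comes from the angle of the eye line, and a yaw cue from the
    mouth's horizontal offset, normalised by the inter-eye distance.
    """
    # Roll: angle of the line joining the two eye centres.
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    roll = math.degrees(math.atan2(dy, dx))

    # Yaw cue: in a frontal face the mouth centre lies directly below
    # the eye midpoint; a horizontal offset suggests the head is turned.
    eye_mid_x = (left_eye[0] + right_eye[0]) / 2.0
    inter_eye = math.hypot(dx, dy)
    yaw_cue = (mouth[0] - eye_mid_x) / inter_eye
    return roll, yaw_cue

# A frontal face: level eyes, mouth centred under them.
roll, yaw_cue = estimate_pose((100, 100), (140, 100), (120, 130))
```

For the frontal example above both cues are zero; any rotation of the three-point configuration shifts them, which is why so few points suffice for a fast estimator.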
Description: This article was published in the journal, International Journal of Advanced Mechatronic Systems (IJAMECHS)[© Inderscience Enterprises Ltd] and the definitive version is available at http://dx.doi.org/10.1504/IJAMECHS.2010.030857
Version: Accepted for publication
DOI: 10.1504/IJAMECHS.2010.030857
URI: https://dspace.lboro.ac.uk/2134/10174
Publisher Link: http://dx.doi.org/10.1504/IJAMECHS.2010.030857
ISSN: 1756-8412
Appears in Collections:Published Articles (Computer Science)

Files associated with this item:

File                                     Size      Format
FacePoseEstimationFromEyesAndMouth.pdf   1.08 MB   Adobe PDF



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.