Loughborough University Institutional Repository

Please use this identifier to cite or link to this item: https://dspace.lboro.ac.uk/2134/20267

Title: Similarity K-d tree method for sparse point pattern matching with underlying non-rigidity
Authors: Li, Baihua
Meng, Qinggang
Holstein, Horst
Keywords: K-dimensional tree
Non-rigid point pattern matching
Non-rigid pose estimation
Robust point pattern correspondence
Motion capture
Issue Date: 2005
Publisher: © IEEE
Citation: LI, B., MENG, Q. and HOLSTEIN, H., 2005. Similarity K-d tree method for sparse point pattern matching with underlying non-rigidity. Pattern Recognition, 38 (12), pp. 2391-2399.
Abstract: We propose a method for matching non-affinely related sparse model and data point-sets of identical cardinality, similar spatial distribution and orientation. To establish a one-to-one match, we introduce a new similarity K-dimensional tree. We construct the tree for the model set using spatial sparsity priority order. A corresponding tree for the data set is then constructed, following the sparsity information embedded in the model tree. A matching sequence between the two point sets is generated by traversing the identically structured trees. Experiments on synthetic and real data confirm that this method is applicable to robust spatial matching of sparse point-sets under moderate non-rigid distortion and arbitrary scaling, thus contributing to non-rigid point-pattern matching. © 2005 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
Version: Accepted for publication
DOI: 10.1016/j.patcog.2005.03.004
URI: https://dspace.lboro.ac.uk/2134/20267
Publisher Link: http://dx.doi.org/10.1016/j.patcog.2005.03.004
ISSN: 0031-3203
Appears in Collections: Published Articles (Computer Science)
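
The abstract above describes a tree-based matching procedure: a K-d tree is built for the model set, an identically structured tree is then built for the data set, and a one-to-one correspondence is read off by traversing the two trees in the same order. The Python sketch below illustrates that general idea only and is not the paper's algorithm: it substitutes a simple widest-extent/median split for the paper's spatial sparsity priority order, and the function names (build_template, leaves_by_template, match) are invented for illustration.

# Illustrative sketch only, not the authors' exact method: a plain
# widest-extent / median split stands in for the paper's spatial
# sparsity priority order. The point shown is the structural idea in
# the abstract: record a tree "template" on the model set, impose the
# same structure on the data set, and pair points by identical traversal.

import numpy as np


def build_template(points, indices):
    """Recursively split points[indices], recording at each node the split
    axis and the number of points sent to the left child, so the identical
    tree structure can later be imposed on a second point set."""
    if len(indices) == 1:
        return ("leaf",)
    pts = points[indices]
    axis = int(np.argmax(pts.max(axis=0) - pts.min(axis=0)))  # widest extent
    order = indices[np.argsort(pts[:, axis], kind="stable")]
    mid = len(order) // 2
    return ("node", axis, mid,
            build_template(points, order[:mid]),
            build_template(points, order[mid:]))


def leaves_by_template(points, indices, template, out):
    """Partition points[indices] by replaying the template's splits and
    collect the original point indices in the template's leaf order."""
    if template[0] == "leaf":
        out.append(int(indices[0]))
        return
    _, axis, mid, left_t, right_t = template
    order = indices[np.argsort(points[indices][:, axis], kind="stable")]
    leaves_by_template(points, order[:mid], left_t, out)
    leaves_by_template(points, order[mid:], right_t, out)


def match(model, data):
    """Return (model_index, data_index) pairs for two equal-sized point sets
    with similar spatial distribution and orientation."""
    assert len(model) == len(data), "point sets must have equal cardinality"
    template = build_template(model, np.arange(len(model)))
    model_leaves, data_leaves = [], []
    leaves_by_template(model, np.arange(len(model)), template, model_leaves)
    leaves_by_template(data, np.arange(len(data)), template, data_leaves)
    return list(zip(model_leaves, data_leaves))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    model = rng.random((8, 3))                                # sparse 3-D model points
    data = 2.5 * model + 0.02 * rng.normal(size=model.shape)  # scaled + jitter
    print(match(model, data))                                 # mostly (i, i) pairs

Because the data tree replays the split structure recorded on the model tree, the k-th leaf reached in each traversal is taken as a matched pair; when the two sets have similar spatial distribution and orientation, per-axis orderings are largely preserved under moderate distortion and arbitrary scaling, which is what makes such a structural pairing plausible.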

Files associated with this item:

File: li-kd-prj04-accepted.pdf
Description: Accepted version
Size: 251.71 kB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.