Real-Time Non-Rigid Shape Recovery via Active Appearance Models for Augmented Reality
A main challenge in Augmented Reality (AR) applications is to accurately track video objects as their movement, orientation, size, and position change, which makes recovering non-rigid shape and global pose in real time a demanding task. This paper proposes a novel two-stage scheme for online non-rigid shape recovery toward AR applications using Active Appearance Models (AAMs). First, we construct 3D shape models from AAMs offline, without processing any 3D scan data. Based on the computed 3D shape models, we then propose an efficient online algorithm that estimates both the 3D pose and the non-rigid shape parameters via local bundle adjustment, building up point correspondences along the way. Without manual intervention, our approach can effectively recover the 3D non-rigid shape from either real-time video sequences or a single image. The recovered 3D pose parameters can be used for AR registration. Furthermore, facial features can be tracked simultaneously, which is critical for many face-related applications. We evaluate our algorithms on several video sequences, and promising experimental results demonstrate that the proposed scheme is effective and significant for real-time AR applications.
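The two-stage idea described in the abstract, an offline AAM-derived linear 3D shape basis followed by online estimation of pose and non-rigid coefficients, can be illustrated with a minimal sketch. This is not the paper's implementation: the weak-perspective (affine) projection, the alternating least-squares solver, and all names (`fit_pose_and_shape`, `mean_shape`, `basis`) are assumptions standing in for the authors' local bundle adjustment.

```python
# Illustrative sketch only -- NOT the authors' code. Assumes a linear 3D shape
# model (mean shape + deformation modes) recovered offline from AAM fits, and
# estimates a weak-perspective pose plus non-rigid shape coefficients from the
# 2D landmarks returned by an AAM tracker, by alternating two linear
# least-squares steps (a simplified stand-in for local bundle adjustment).
import numpy as np

def fit_pose_and_shape(landmarks_2d, mean_shape, basis, n_iters=10):
    """Estimate a 2x3 scaled rotation R, 2-vector translation t, and shape
    coefficients c so that  landmarks_2d ~= R @ (mean_shape + basis @ c) + t.

    landmarks_2d : (2, N) 2D feature points from the AAM fit
    mean_shape   : (3, N) mean 3D shape
    basis        : (3, N, K) non-rigid deformation modes
    """
    N = landmarks_2d.shape[1]
    K = basis.shape[2]
    c = np.zeros(K)

    for _ in range(n_iters):
        # Pose step: with the shape fixed, fit a 2x4 affine projection P
        # so that landmarks_2d ~= P @ [shape_3d; 1].
        shape_3d = mean_shape + basis @ c            # (3, N)
        A = np.vstack([shape_3d, np.ones((1, N))])   # (4, N) homogeneous
        P, *_ = np.linalg.lstsq(A.T, landmarks_2d.T, rcond=None)
        P = P.T                                      # (2, 4)
        R, t = P[:, :3], P[:, 3]

        # Shape step: with the pose fixed, the coefficients enter linearly,
        # so solve a second least-squares problem for c.
        resid = (landmarks_2d - R @ mean_shape - t[:, None]).reshape(-1)  # (2N,)
        B = np.stack([(R @ basis[:, :, k]).reshape(-1) for k in range(K)],
                     axis=1)                          # (2N, K)
        c, *_ = np.linalg.lstsq(B, resid, rcond=None)

    return R, t, c
```

Note that this sketch does not enforce orthonormality of the recovered rotation; a real-time tracker in the spirit of the paper would project `R` back onto a valid rotation (or use the bundle-adjustment formulation directly) and would initialise each frame from the previous frame's estimate.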
Main Authors: | ZHU, Jianke; HOI, Steven C. H.; LYU, Michael R. |
---|---|
Format: | text (application/pdf) |
Language: | English |
Published: | Institutional Knowledge at Singapore Management University, 2006-05-01 |
DOI: | 10.1007/11744023_15 |
Subjects: | Computer Sciences; Databases and Information Systems |
Collection: | Research Collection School Of Computing and Information Systems |
Online Access: | https://ink.library.smu.edu.sg/sis_research/2393 https://ink.library.smu.edu.sg/context/sis_research/article/3393/viewcontent/ECCV06_1105.pdf |
License: | http://creativecommons.org/licenses/by-nc-nd/4.0/ |
Institution: | Singapore Management University |