Natural two view learning

Co-training is a semi-supervised learning method that learns effectively from a pool of labeled and unlabeled data by taking advantage of redundancy in the feature set. Co-training is known to work well when the assumptions it makes about the feature set hold true. In this project, we investigated the use of co-training for action recognition for elderly people. Experimental results showed that co-training was able to boost action-recognition performance with very few labeled samples. We also present a new co-training strategy, natural two view learning, which does not require the prior existence of two redundant views. Because our proposed strategy does not require the feature set to be described by sufficient and redundant views, it can be applied to a broader class of problems. Experiments on UCI data sets indicate that the proposed natural two view learning algorithm improves classification accuracy, especially when the number of labeled examples is very small.
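The co-training procedure summarized above (two redundant feature views, each classifier labeling its most confident unlabeled examples for the other) can be sketched as follows. This is an illustrative sketch in the style of Blum and Mitchell's algorithm; the nearest-centroid base learner, function names, and parameters are assumptions for demonstration, not taken from the project.

```python
import numpy as np

def centroid_predict(X_lab, y_lab, X):
    """Nearest-centroid base learner; confidence is the margin
    between the two closest class centroids."""
    classes = np.unique(y_lab)
    cents = np.array([X_lab[y_lab == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(X[:, None, :] - cents[None, :, :], axis=2)
    pred = classes[d.argmin(axis=1)]
    d_sorted = np.sort(d, axis=1)
    conf = d_sorted[:, 1] - d_sorted[:, 0]  # margin: larger = more confident
    return pred, conf

def co_train(X1, X2, y, labeled, rounds=10, k=1):
    """Co-training over two views X1, X2. Labels in y are read only at
    `labeled` indices; each view labels its k most confident unlabeled
    points, growing the labeled pool for the other view."""
    y = y.copy()
    labeled = set(labeled)
    for _ in range(rounds):
        for Xa in (X1, X2):
            unl = np.array(sorted(set(range(len(y))) - labeled))
            if unl.size == 0:
                return y
            lab = np.array(sorted(labeled))
            pred, conf = centroid_predict(Xa[lab], y[lab], Xa[unl])
            for j in conf.argsort()[::-1][:k]:
                y[unl[j]] = pred[j]
                labeled.add(int(unl[j]))
    return y
```

On well-separated synthetic data, seeding with a single labeled example per class is enough for this loop to propagate labels through the unlabeled pool, mirroring the "very few labeled samples" setting the abstract describes.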


Bibliographic Details
Main Author: Dubey, Rachit.
Other Authors: Wu Jianxin
Format: Final Year Project
Language: English
Published: 2012
Subjects:
Online Access: http://hdl.handle.net/10356/48591
Institution: Nanyang Technological University
Language: English
id sg-ntu-dr.10356-48591
record_format dspace
spelling sg-ntu-dr.10356-485912023-03-03T20:50:26Z Natural two view learning Dubey, Rachit. Wu Jianxin School of Computer Engineering Centre for Computational Intelligence DRNTU::Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence Co-training is a semi-supervised learning method that learns effectively from a pool of labeled and unlabeled data by taking advantage of redundancy in the feature set. Co-training is known to work well when the assumptions it makes about the feature set hold true. In this project, we investigated the use of co-training for action recognition for elderly people. Experimental results showed that co-training was able to boost action-recognition performance with very few labeled samples. We also present a new co-training strategy, natural two view learning, which does not require the prior existence of two redundant views. Because our proposed strategy does not require the feature set to be described by sufficient and redundant views, it can be applied to a broader class of problems. Experiments on UCI data sets indicate that the proposed natural two view learning algorithm improves classification accuracy, especially when the number of labeled examples is very small. Bachelor of Engineering (Computer Science) 2012-04-27T00:59:35Z 2012-04-27T00:59:35Z 2012 2012 Final Year Project (FYP) http://hdl.handle.net/10356/48591 en Nanyang Technological University 47 p. application/pdf
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
Singapore
content_provider NTU Library
collection DR-NTU
language English
topic DRNTU::Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
spellingShingle DRNTU::Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Dubey, Rachit.
Natural two view learning
description Co-training is a semi-supervised learning method that learns effectively from a pool of labeled and unlabeled data by taking advantage of redundancy in the feature set. Co-training is known to work well when the assumptions it makes about the feature set hold true. In this project, we investigated the use of co-training for action recognition for elderly people. Experimental results showed that co-training was able to boost action-recognition performance with very few labeled samples. We also present a new co-training strategy, natural two view learning, which does not require the prior existence of two redundant views. Because our proposed strategy does not require the feature set to be described by sufficient and redundant views, it can be applied to a broader class of problems. Experiments on UCI data sets indicate that the proposed natural two view learning algorithm improves classification accuracy, especially when the number of labeled examples is very small.
author2 Wu Jianxin
author_facet Wu Jianxin
Dubey, Rachit.
format Final Year Project
author Dubey, Rachit.
author_sort Dubey, Rachit.
title Natural two view learning
title_short Natural two view learning
title_full Natural two view learning
title_fullStr Natural two view learning
title_full_unstemmed Natural two view learning
title_sort natural two view learning
publishDate 2012
url http://hdl.handle.net/10356/48591
_version_ 1759853402270990336