On the pooling of positive examples with ontology for visual concept learning
Main Authors:
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2011
Subjects:
Online Access: https://ink.library.smu.edu.sg/sis_research/6520 https://ink.library.smu.edu.sg/context/sis_research/article/7523/viewcontent/2072298.2071934.pdf
Institution: Singapore Management University
Summary: A common obstacle in effective learning of visual concept classifiers is the scarcity of positive training examples due to expensive labeling cost. This paper explores the sampling of weakly tagged web images for concept learning without human assistance. In particular, ontology knowledge is incorporated for semantic pooling of positive examples from ontologically neighboring concepts. This effectively widens the coverage of the positive samples with visually more diversified content, which is important for learning a good concept classifier. We experiment with two learning strategies: aggregate and incremental. The former strategy re-trains a new classifier by combining existing and newly collected examples, while the latter updates the existing model using the new samples incrementally. Extensive experiments on NUS-WIDE and VOC 2010 datasets show very encouraging results, even when comparing with classifiers learnt using expert labeled training examples.
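The two learning strategies named in the summary can be illustrated with a toy sketch. This is not the paper's implementation: it uses a plain logistic-regression classifier trained by SGD on synthetic data, and all function names and parameters here are illustrative assumptions. The "aggregate" strategy re-trains from scratch on the old and new example pools combined, while the "incremental" strategy continues training an existing model on the new samples only.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_batch(n, shift=2.0, dim=8):
    # Synthetic stand-in for labeled image features:
    # positives (y == 1) are shifted away from negatives.
    X = rng.normal(size=(n, dim))
    y = rng.integers(0, 2, size=n)
    X[y == 1] += shift
    return X, y

def sgd_logistic(X, y, w=None, epochs=20, lr=0.1):
    # Logistic regression trained by plain SGD. Passing an existing
    # weight vector `w` continues training (incremental update);
    # passing None trains a fresh model from scratch.
    if w is None:
        w = np.zeros(X.shape[1] + 1)
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias column
    for _ in range(epochs):
        for i in rng.permutation(len(Xb)):
            p = 1.0 / (1.0 + np.exp(-Xb[i] @ w))
            w -= lr * (p - y[i]) * Xb[i]
    return w

def accuracy(w, X, y):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.mean((Xb @ w > 0) == y)

X_old, y_old = make_batch(200)   # existing labeled pool
X_new, y_new = make_batch(200)   # newly collected (e.g. web) examples
X_test, y_test = make_batch(500)

# Aggregate: re-train a new classifier on old + new pooled together.
w_agg = sgd_logistic(np.vstack([X_old, X_new]),
                     np.concatenate([y_old, y_new]))

# Incremental: fit on the old pool, then update with new samples only.
w_inc = sgd_logistic(X_new, y_new, w=sgd_logistic(X_old, y_old))

print(f"aggregate acc:   {accuracy(w_agg, X_test, y_test):.2f}")
print(f"incremental acc: {accuracy(w_inc, X_test, y_test):.2f}")
```

The trade-off the sketch exposes is the one the summary describes: aggregate training revisits all examples (more compute, full coverage), while incremental updating touches only the new batch (cheap, but the old pool is seen just once).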