Hierarchical feature attention with bottleneck attention modules for multi-branch classification
Main Author:
Other Authors:
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2024
Subjects:
Online Access: https://hdl.handle.net/10356/177332
Institution: Nanyang Technological University
Summary: While existing attention mechanisms often focus on pre-processing the input image, fine-grained classification tasks benefit from leveraging the hierarchical relationships within categories. For example, classifying bird species involves understanding broader categories such as orders and families, and this inherent structure helps reduce ambiguity in predictions.

This work proposes a novel approach that integrates Bottleneck Attention Modules (BAM) within a ResNet50 backbone for multi-task classification. By employing a separate feature branch tailored to each task and applying BAM after each branch, the model learns more discriminative features for each level of the hierarchy. This report details the architecture and training strategy of the proposed model.
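The summary describes the architecture only at a high level. The sketch below illustrates one plausible reading of that design: a shared ResNet50 trunk feeding a separate branch per hierarchy level (e.g. order / family / species), each refined by a Bottleneck Attention Module before its own classifier head. The branch widths, class counts, and names are illustrative assumptions, not the report's actual implementation.

```python
# Minimal sketch (not the report's code) of a multi-branch classifier with
# per-branch Bottleneck Attention Modules on a shared ResNet50 trunk.
import torch
import torch.nn as nn
from torchvision.models import resnet50


class BAM(nn.Module):
    """Bottleneck Attention Module: F' = F * (1 + sigmoid(M_c(F) + M_s(F)))."""

    def __init__(self, channels: int, reduction: int = 16, dilation: int = 4):
        super().__init__()
        # Channel attention: global average pool -> bottleneck MLP
        self.channel_att = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.BatchNorm2d(channels // reduction),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
        )
        # Spatial attention: 1x1 reduction -> dilated 3x3 conv -> 1-channel map
        self.spatial_att = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.BatchNorm2d(channels // reduction),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels // reduction, kernel_size=3,
                      padding=dilation, dilation=dilation),
            nn.BatchNorm2d(channels // reduction),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, 1, kernel_size=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        att = torch.sigmoid(self.channel_att(x) + self.spatial_att(x))
        return x * (1 + att)


class HierarchicalBAMClassifier(nn.Module):
    """Shared ResNet50 trunk with one BAM-refined branch per hierarchy level."""

    def __init__(self, num_classes_per_level: dict):
        super().__init__()
        backbone = resnet50(weights=None)
        # Shared trunk: all layers except the final global pool and fc head
        self.trunk = nn.Sequential(*list(backbone.children())[:-2])  # (B, 2048, H, W)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.branches = nn.ModuleDict()
        self.heads = nn.ModuleDict()
        for level, n_cls in num_classes_per_level.items():
            # Task-specific branch followed by its own BAM, then a linear head
            self.branches[level] = nn.Sequential(
                nn.Conv2d(2048, 512, kernel_size=1),
                nn.BatchNorm2d(512),
                nn.ReLU(inplace=True),
                BAM(512),
            )
            self.heads[level] = nn.Linear(512, n_cls)

    def forward(self, x: torch.Tensor) -> dict:
        feats = self.trunk(x)
        out = {}
        for level, branch in self.branches.items():
            f = self.pool(branch(feats)).flatten(1)
            out[level] = self.heads[level](f)
        return out


if __name__ == "__main__":
    # Illustrative class counts for a bird dataset (order / family / species)
    model = HierarchicalBAMClassifier({"order": 13, "family": 37, "species": 200})
    logits = model(torch.randn(2, 3, 224, 224))
    print({k: v.shape for k, v in logits.items()})
```

In a multi-task training setup of this kind, each level's logits would typically get its own cross-entropy loss, with the per-level losses summed (possibly weighted) to form the training objective; the report's actual training strategy may differ.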