Hierarchical feature attention with bottleneck attention modules for multi-branch classification
While existing attention mechanisms often focus on pre-processing images, fine-grained classification tasks benefit from leveraging hierarchical relationships within categories. For example, classifying bird species involves understanding broader categories like orders and families. This inherent structure helps reduce ambiguity in predictions. This work proposes a novel approach that integrates Bottleneck Attention Modules (BAM) within a ResNet50 backbone for multi-task classification. By employing separate feature branches tailored to each task and applying BAM after each branch, the model learns more discriminative features for each hierarchy. This report details the architecture and training strategy of the proposed model.
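The abstract's core idea — refining each task branch's features with a Bottleneck Attention Module before classification — can be illustrated with a minimal NumPy sketch. This is not the author's implementation: the weight shapes, the reduction ratio `r`, and the simplified spatial branch (a channel-mean map standing in for BAM's dilated convolutions) are assumptions for illustration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bam_refine(feat, w1, w2):
    """Simplified BAM-style refinement of one branch's feature map.

    feat: (C, H, W) feature tensor from one task branch.
    w1:   (C // r, C) bottleneck MLP weight (reduction ratio r assumed).
    w2:   (C, C // r) expansion weight back to C channels.
    """
    # Channel attention: global average pool, then a bottleneck MLP.
    pooled = feat.mean(axis=(1, 2))                 # (C,)
    hidden = np.maximum(w1 @ pooled, 0.0)           # ReLU bottleneck
    channel_att = (w2 @ hidden)[:, None, None]      # (C, 1, 1), broadcasts over H, W
    # Spatial attention: channel-mean map (stand-in for BAM's dilated convs).
    spatial_att = feat.mean(axis=0, keepdims=True)  # (1, H, W)
    # BAM combines both branches: F' = F * (1 + sigmoid(M_c + M_s)).
    return feat * (1.0 + sigmoid(channel_att + spatial_att))
```

In the proposed model, each hierarchy level (e.g. order, family, species) would have its own branch off the shared ResNet50 features, with a refinement of this kind applied after each branch so every level attends to its own discriminative regions.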
Saved in:
| Main Author: | Gan, Ryan |
| Other Authors: | Jiang Xudong |
| Format: | Final Year Project |
| Language: | English |
| Published: | Nanyang Technological University, 2024 |
| Subjects: | Engineering |
| Online Access: | https://hdl.handle.net/10356/177332 |
| Institution: | Nanyang Technological University |
| id | sg-ntu-dr.10356-177332 |
| record_format | dspace |
| school | School of Electrical and Electronic Engineering |
| supervisor | Jiang Xudong (EXDJiang@ntu.edu.sg) |
| degree | Bachelor's degree |
| date accessioned | 2024-05-28 |
| date issued | 2024 |
| citation | Gan, R. (2024). Hierarchical feature attention with bottleneck attention modules for multi-branch classification. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/177332 |
| call number | A3072-231 |
| institution | Nanyang Technological University |
| building | NTU Library |
| continent | Asia |
| country | Singapore |
| content_provider | NTU Library |
| collection | DR-NTU |
| language | English |
| topic | Engineering |
| description | While existing attention mechanisms often focus on pre-processing images, fine-grained classification tasks benefit from leveraging hierarchical relationships within categories. For example, classifying bird species involves understanding broader categories like orders and families. This inherent structure helps reduce ambiguity in predictions. This work proposes a novel approach that integrates Bottleneck Attention Modules (BAM) within a ResNet50 backbone for multi-task classification. By employing separate feature branches tailored to each task and applying BAM after each branch, the model learns more discriminative features for each hierarchy. This report details the architecture and training strategy of the proposed model. |
| author | Gan, Ryan |
| author2 | Jiang Xudong |
| format | Final Year Project |
| title | Hierarchical feature attention with bottleneck attention modules for multi-branch classification |
| publisher | Nanyang Technological University |
| publishDate | 2024 |
| url | https://hdl.handle.net/10356/177332 |