Joint weakly supervised image emotion analysis based on interclass discrimination and intraclass correlation
Main Authors: (not listed)
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2024
Online Access:
https://ink.library.smu.edu.sg/sis_research/9511
https://ink.library.smu.edu.sg/context/sis_research/article/10511/viewcontent/2024_5_Joint_Weakly_Supervised_Image_Emotion_Analysis_Based_on_Interclass_Discrimination_and_Intraclass_Correlation.pdf
Institution: Singapore Management University
Summary: Regional information-based image emotion analysis has recently garnered significant attention. However, existing methods often identify region proposals through layered steps or rely solely on visual saliency. These approaches can underestimate emotional categories and lack both comprehensive interclass discrimination perception and intraclass emotional context mining. To address these limitations, we propose InterIntraIEA, a novel approach that jointly learns interclass discrimination and intraclass correlation for image emotion analysis. The proposed method not only employs category-specific dictionary learning for class adaptation, but also models intraclass contextual relationships and perceives correlations at the channel level. This refinement improves interclass descriptive ability and strengthens emotional category representations, producing pseudomaps that carry more precise emotional region information. These pseudomaps, together with top-level features extracted by a multiscale extractor, are then fed into a weakly supervised fusion module to predict emotional sentiment categories.
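The summary mentions perceiving correlations at the channel level to refine features. As an illustration only (the record does not specify the actual InterIntraIEA module, so the function name, shapes, and residual design below are assumptions), a minimal channel-correlation reweighting step can be sketched with a Gram matrix over channels:

```python
import numpy as np

def channel_correlation_reweight(feat):
    """Refine a feature map using channel-level correlations.

    feat: array of shape (C, H, W). Illustrative sketch only; not the
    authors' actual module.
    """
    c, h, w = feat.shape
    flat = feat.reshape(c, h * w)            # (C, HW): one row per channel
    gram = flat @ flat.T                     # (C, C) channel affinities
    # Row-wise softmax (numerically stabilized) turns affinities
    # into attention weights over channels.
    gram = gram - gram.max(axis=1, keepdims=True)
    attn = np.exp(gram)
    attn /= attn.sum(axis=1, keepdims=True)
    refined = (attn @ flat).reshape(c, h, w) # correlation-refined features
    return feat + refined                    # residual connection

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4, 4))           # toy 8-channel feature map
y = channel_correlation_reweight(x)
print(y.shape)                               # (8, 4, 4)
```

The residual connection keeps the original activations while adding a correlation-weighted mixture of all channels, a common pattern in channel-attention designs.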