MASNet: a robust deep marine animal segmentation network
Marine animal studies are of great importance to human beings and instrumental to many research areas. How to identify such animals through image processing is a challenging task that leads to marine animal segmentation (MAS). Although deep neural networks have been widely applied for object segmentation, few of them consider the complex imaging conditions in the water and the camouflage property of marine animals. To this end, a robust deep marine animal segmentation network is proposed in this article. Specifically, we design a new data augmentation strategy to randomly change the degradation and camouflage attributes of the original objects. With the augmentations, a fusion-based deep neural network constructed in a Siamese manner is trained to learn the shared semantic representations. Moreover, we construct a new large-scale real-world MAS data set for conducting extensive experiments. It consists of over 3000 images with various underwater scenes and objects. Each image is annotated with an object-level mask and assigned to a category. Extensive experimental results show that our method significantly outperforms 12 state-of-the-art methods both qualitatively and quantitatively.
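The abstract outlines two ideas that a short sketch can make concrete: augmenting each training image by randomly altering its degradation and camouflage attributes, and training a weight-shared (Siamese) network on the original/augmented pair so that both branches learn a shared semantic representation. The sketch below only illustrates that training pattern under assumed details; the toy encoder-decoder, the placeholder `degrade_and_camouflage` augmentation, and the loss weighting are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of Siamese training on an original image and an
# augmented copy with randomly altered degradation/camouflage attributes.
# Not the authors' MASNet implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySegNet(nn.Module):
    """Toy encoder-decoder standing in for the (unspecified) MASNet backbone."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 2, stride=2),
        )

    def forward(self, x):
        feat = self.enc(x)
        return self.dec(feat), feat  # mask logits + shared features

def degrade_and_camouflage(img):
    """Placeholder augmentation: random colour cast, contrast drop, and noise,
    loosely mimicking underwater degradation; purely an assumption here."""
    cast = 0.8 + 0.4 * torch.rand(1, 3, 1, 1, device=img.device)
    noisy = img * cast * (0.5 + 0.5 * torch.rand(())) + 0.05 * torch.randn_like(img)
    return noisy.clamp(0, 1)

def siamese_step(model, img, mask, optimizer):
    aug = degrade_and_camouflage(img)
    pred_o, feat_o = model(img)  # original branch
    pred_a, feat_a = model(aug)  # augmented branch, same weights
    seg_loss = (F.binary_cross_entropy_with_logits(pred_o, mask)
                + F.binary_cross_entropy_with_logits(pred_a, mask))
    consistency = F.mse_loss(feat_a, feat_o)  # pull shared representations together
    loss = seg_loss + 0.1 * consistency       # 0.1 is an arbitrary illustrative weight
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = TinySegNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    img = torch.rand(2, 3, 64, 64)                    # dummy underwater images
    mask = (torch.rand(2, 1, 64, 64) > 0.5).float()   # dummy object-level masks
    print(siamese_step(model, img, mask, opt))
```

The point of the pattern is that both branches share parameters, so segmentation supervision on the clean and the degraded/camouflaged views, plus the consistency term, pushes the encoder toward representations that are stable under the augmentation.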
Main Authors: Fu, Zhenqi; Chen, Ruizhe; Huang, Yue; Cheng, En; Ding, Xinghao; Ma, Kai-Kuang
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2023
Subjects: Engineering::Electrical and electronic engineering; Marine Animal Segmentation; Object Camouflage
Online Access: https://hdl.handle.net/10356/172510
Institution: Nanyang Technological University
Record ID: sg-ntu-dr.10356-172510
Collection: DR-NTU (NTU Library)
Citation: Fu, Z., Chen, R., Huang, Y., Cheng, E., Ding, X. & Ma, K. (2023). MASNet: a robust deep marine animal segmentation network. IEEE Journal of Oceanic Engineering. https://dx.doi.org/10.1109/JOE.2023.3252760
Journal: IEEE Journal of Oceanic Engineering
ISSN: 1558-1691
DOI: 10.1109/JOE.2023.3252760
Scopus ID: 2-s2.0-85159796084
Funding: Supported in part by the National Natural Science Foundation of China under Grant 82172033, Grant U19B2031, Grant 61971369, Grant 52105126, Grant 82272071, and Grant 62271430.
Rights: © 2023 IEEE. All rights reserved.