Deep anomaly detection for medical images

Bibliographic Details
Main Author: Li, Xintong
Other Authors: Lin Zhiping
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2020
Subjects:
Online Access: https://hdl.handle.net/10356/140534
Institution: Nanyang Technological University
Summary: Deep learning methods have been demonstrated to be effective in many medical tasks. However, these methods normally require a large amount of labeled data, which is costly to obtain, especially for disease screening, where abnormal/diseased data are harder to collect. The main purpose of this Final Year Project is to investigate transfer learning-based anomaly detection methods that require few or no labeled data instances. In this work, two anomaly detection methods are proposed. A semi-supervised joint learning method (SmSupJL) trains a feature extractor with two losses, a cross-entropy loss and an intra-class variance loss, on a small labeled training set. By combining these two losses, the feature extractor learns features that discriminate normal from abnormal samples while keeping the normal samples compact in feature space. To further reduce the number of labeled instances needed, we propose an unsupervised domain adaptation method (UnSupDA), which requires no labeled instances from the target domain and only a small number of labeled instances from the source domain to detect anomalies. Self-supervised tasks are used to align the source and target domains and thus transfer the knowledge learned from the source domain to the target domain. Experimental results on the Kaggle Diabetic Retinopathy (DR) dataset demonstrate that the performance of these methods either surpasses or is comparable to the current state of the art.
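
To illustrate the kind of joint objective the summary describes for SmSupJL, the sketch below combines a cross-entropy loss with an intra-class variance penalty that pulls normal-class features toward their batch centroid. This is a minimal sketch, not the author's implementation: the FeatureExtractor module, the intra_class_variance and joint_loss helpers, the feature dimensions, the lambda_var weight, and the choice of label 0 as the normal class are all illustrative assumptions.

```python
# Minimal sketch of a joint objective in the spirit of SmSupJL:
# cross-entropy plus an intra-class variance penalty on normal-class features.
# Backbone, feature sizes, and lambda_var are illustrative assumptions.
import torch
import torch.nn as nn

class FeatureExtractor(nn.Module):
    def __init__(self, in_dim=2048, feat_dim=128, num_classes=2):
        super().__init__()
        # Stand-in for a pretrained CNN backbone followed by an embedding layer.
        self.embed = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        feats = self.embed(x)
        logits = self.classifier(feats)
        return feats, logits

def intra_class_variance(feats, labels, normal_label=0):
    """Mean squared distance of normal-class features to their batch centroid."""
    normal_feats = feats[labels == normal_label]
    if normal_feats.shape[0] < 2:
        return feats.new_zeros(())
    centroid = normal_feats.mean(dim=0, keepdim=True)
    return ((normal_feats - centroid) ** 2).sum(dim=1).mean()

def joint_loss(feats, logits, labels, lambda_var=0.1):
    # Cross-entropy keeps normal/abnormal features discriminative;
    # the variance term keeps the normal cluster compact.
    ce = nn.functional.cross_entropy(logits, labels)
    var = intra_class_variance(feats, labels)
    return ce + lambda_var * var

# Usage on a dummy batch of pre-extracted features.
model = FeatureExtractor()
x = torch.randn(16, 2048)
y = torch.randint(0, 2, (16,))
feats, logits = model(x)
loss = joint_loss(feats, logits, y)
loss.backward()
```

In this sketch the variance penalty is computed only over the normal samples in each batch, matching the stated goal of preserving the compactness of normal samples while the cross-entropy term separates the two classes; the actual losses and weighting used in the project may differ.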