Implementation of machine learning techniques to denoise and unmix TEM spectroscopic dataset

Rapid advancement in Transmission Electron Microscopy (TEM) instrumentation has enabled the acquisition of high-resolution, nanoscale images, allowing materials scientists to analyse samples with complex designs in depth. Concurrently, however, it has produced highly mixed datasets: each pixel of an imaged sample is a combination of signals from multiple constituent elements and phases. Separating, or unmixing, such mixed images is required for tasks including quantification and identification. This project involves two computational algorithms developed for that purpose: Vertex Component Analysis (VCA) and Bayesian Linear Unmixing (BLU).
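
To make the mixing model concrete, the sketch below illustrates the vertex-seeking intuition behind VCA on a flattened spectrum image (pixels by energy channels). It is a simplified successive-projection scheme written for this summary, not the thesis implementation: full VCA additionally performs SVD-based subspace estimation and uses random projection directions.

```python
# Minimal sketch of pure-pixel endmember extraction in the spirit of VCA.
# Simplified successive-projection scheme, not the thesis code.
import numpy as np

def extract_endmembers(X, n_endmembers):
    """X: (n_pixels, n_channels) mixed spectra; returns endmember spectra."""
    n_pixels, n_channels = X.shape
    # Start from the pixel with the largest spectral norm.
    indices = [int(np.argmax(np.linalg.norm(X, axis=1)))]
    for _ in range(n_endmembers - 1):
        E = X[indices].T                      # (n_channels, k) current endmembers
        # Projector onto the orthogonal complement of the endmember subspace.
        P = np.eye(n_channels) - E @ np.linalg.pinv(E)
        residual = X @ P.T                    # project every pixel
        # The most "extreme" remaining pixel becomes the next vertex.
        indices.append(int(np.argmax(np.linalg.norm(residual, axis=1))))
    return X[indices]

# Abundances can then be estimated per pixel by least squares, e.g.:
# A = np.linalg.lstsq(endmembers.T, X.T, rcond=None)[0]
```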

The project first focused on implementing these algorithms in HyperSpy, an open-source analytical imaging toolbox written in Python. The scripts for the two techniques were designed independently and then incorporated into the existing codebase so that they could fully exploit the functionality available in HyperSpy. The implementation was verified on sample EDX and EELS images, confirming that the code did not produce arbitrary unmixing outputs.
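
Assuming a standard HyperSpy session, the calls below show where such an unmixing routine slots into HyperSpy's decomposition workflow. The file name and component count are invented, and FastICA, HyperSpy's built-in blind source separation, stands in here for the thesis's VCA and BLU entry points, which the abstract does not specify.

```python
# Hedged sketch of an unmixing pass in HyperSpy. "edx_map.hspy" and the
# component count are invented; FastICA stands in for the thesis's VCA/BLU.
import hyperspy.api as hs

s = hs.load("edx_map.hspy")                  # hypothetical EDX spectrum image
s.decomposition()                            # SVD first, as BSS requires it
s.plot_explained_variance_ratio()            # scree plot to choose components
s.blind_source_separation(number_of_components=3)
s.plot_bss_results()                         # inspect factors and loadings
```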
The project's second phase studied dataset pre-treatment techniques on highly noise-corrupted EDX images and compared the unmixing performance of BLU and VCA. The images used were those of a methylammonium lead iodide (MAPbI3) perovskite film and an In(Zn)P/ZnS core-shell nanocrystal. Permuted combinations of three pre-treatment methods were applied to the images: binning, which boosts the signal per pixel by reducing image resolution; cropping, which restricts the analysis to the region of interest and excludes irrelevant signals; and normalization, which addresses the shot-noise nature of EDX images. Applying all three methods together produced the optimal unmixing outputs for both BLU and VCA; a sketch of this pipeline follows.
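
A minimal sketch of the three pre-treatment steps in HyperSpy, assuming the same hypothetical dataset; the binning factor and crop window are illustrative rather than the thesis settings.

```python
# Illustrative pre-treatment pipeline: binning, cropping, and Poisson-aware
# normalization. Factors and window are examples, not the thesis settings.
import hyperspy.api as hs

s = hs.load("edx_map.hspy")                  # hypothetical EDX spectrum image

# 1) Binning: merge 2x2 pixel blocks to raise counts per pixel
#    (at the cost of spatial resolution).
s = s.rebin(scale=(2, 2, 1))

# 2) Cropping: keep only the navigation-space region of interest.
s = s.inav[10:50, 10:50]

# 3) Normalization: weight the data for Poisson (shot) noise before unmixing.
s.decomposition(normalize_poissonian_noise=True)
s.blind_source_separation(number_of_components=3)
```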
Furthermore, a synthetic dataset was created via HyperSpy to test the Signal-to-Noise Ratio (SNR) dependence of BLU and VCA. Interestingly, BLU had a larger margin of unmixing error than VCA, but under heavy noise corruption BLU performed marginally better. Overall, however, VCA excelled, with lighter resource demand, faster processing time and reasonably accurate unmixing output.
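
The SNR test can be emulated along the following lines; the two Gaussian "phases", the constant background, the abundance gradient and the dose levels are all invented for illustration.

```python
# Hedged sketch of a synthetic SNR study: build a two-phase spectrum image,
# apply Poisson noise at several dose levels, and unmix each realization.
import numpy as np
import hyperspy.api as hs

channels = np.arange(1024)
# Two invented endmember spectra: Gaussian peaks plus a small flat background
# so no energy channel is exactly zero (avoids divide-by-zero in the
# Poisson normalization step).
em1 = np.exp(-0.5 * ((channels - 300) / 15) ** 2) + 0.05
em2 = np.exp(-0.5 * ((channels - 700) / 15) ** 2) + 0.05

# Linear abundance gradient over a 32x32 map: every pixel mixes both phases.
a = np.linspace(0, 1, 32 * 32).reshape(32, 32, 1)
clean = a * em1 + (1 - a) * em2              # shape (32, 32, 1024)

for dose in (10, 100, 1000):                 # mean counts: the SNR knob
    s = hs.signals.Signal1D(clean * dose)
    s.add_poissonian_noise()                 # in-place Poisson draw (shot noise)
    s.decomposition(normalize_poissonian_noise=True)
    s.blind_source_separation(number_of_components=2)
    # Compare recovered factors/loadings against em1, em2 and the gradient
    # to score the unmixing error at this SNR.
```
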
Bibliographic Details
Main Author: Quang, Uy Thinh
Other Authors: Martial Duchamp
Format: Final Year Project
Language: English
Published: 2018
Subjects: DRNTU::Engineering
Online Access:http://hdl.handle.net/10356/73745
Institution: Nanyang Technological University
School: School of Materials Science and Engineering
Degree: Bachelor of Engineering (Materials Engineering)
Physical Description: 65 p. (application/pdf)