Collaborative deep learning inference in edge-cloud computing

Bibliographic Details
Main Author: Lee, Martyn Eng Hui
Other Authors: Zhang Tianwei
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2021
Subjects:
Online Access: https://hdl.handle.net/10356/148076
Description
Summary: As deep learning becomes increasingly popular in the machine learning literature, more research has been devoted to applying such tools to commercial and business use [25]. One of the more recent developments is collaborative inference. Reducing inference latency through collaborative inference has been well studied [3,7]; however, those experiments were conducted on state-of-the-art mobile edge hardware that is not found in commercial devices. With the advent of more powerful mobile GPUs, it is a natural step to evaluate such latency- and load-saving techniques on mobile devices currently on the market. This evaluation produced qualified positive results: collaborative inference remains viable for commercial devices under certain conditions, despite their clear GPU deficit relative to state-of-the-art counterparts. Certain strategies should be considered when applying collaborative inference to commercial devices.
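The core idea behind collaborative inference is to run the first layers of a network on the device and offload the rest to the cloud, choosing the split that minimizes end-to-end latency. The sketch below illustrates that split-point search; all per-layer timings, activation sizes, and the uplink rate are illustrative assumptions, not measurements from this project.

```python
# Hedged sketch of split-point selection for collaborative inference:
# run layers 0..k-1 on the mobile device, upload the intermediate
# activation, and run layers k..n-1 in the cloud. All figures below
# are illustrative assumptions.

def best_split(device_ms, cloud_ms, act_bytes, uplink_bytes_per_ms):
    """Return (split_index, total_ms) minimising end-to-end latency.

    device_ms[i]        latency of layer i on the mobile GPU (ms)
    cloud_ms[i]         latency of layer i on the cloud server (ms)
    act_bytes[k]        bytes uploaded when splitting before layer k
                        (act_bytes[0] is the raw input; len == n + 1)
    uplink_bytes_per_ms network uplink rate
    """
    n = len(device_ms)
    best_k, best_ms = 0, float("inf")
    for k in range(n + 1):  # k layers stay on the device
        upload = act_bytes[k] / uplink_bytes_per_ms
        total = sum(device_ms[:k]) + upload + sum(cloud_ms[k:])
        if total < best_ms:
            best_k, best_ms = k, total
    return best_k, best_ms

# Illustrative 4-layer model: the mobile GPU is slower per layer,
# but early activations are large and expensive to upload.
device = [5.0, 8.0, 12.0, 15.0]                   # ms per layer on device
cloud = [1.0, 1.5, 2.0, 2.5]                      # ms per layer in cloud
acts = [600_000, 300_000, 80_000, 20_000, 4_000]  # bytes at each cut point
k, total = best_split(device, cloud, acts, uplink_bytes_per_ms=50_000)
```

With these numbers, splitting after the first layer beats both device-only and cloud-only execution, because the first layer shrinks the activation enough to make the upload cheap. Real deployments would profile the actual device, server, and network instead of assuming fixed costs.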