Collaborative deep learning inference in edge-cloud computing


Bibliographic Details
Main Author: Lee, Martyn Eng Hui
Other Authors: Zhang Tianwei
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2021
Subjects:
Online Access:https://hdl.handle.net/10356/148076
Institution: Nanyang Technological University
Description
Summary: As deep learning has become increasingly popular in the machine learning literature, more research has focused on applying such tools to commercial and business use [25]. One of the more recent developments is collaborative inference. Achieving better latency through collaborative inference has been well studied [3,7]; however, those experiments were conducted on state-of-the-art mobile edge devices that are not found in consumer products. With the advent of more powerful mobile GPUs, it is a natural step to consider such latency- and load-saving techniques for mobile devices on the market today. The project yielded qualified positive results: despite a clear GPU deficit relative to state-of-the-art counterparts, collaborative inference remains viable for commercial devices under certain conditions, and certain strategies should be considered when applying it to such devices.
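At the core of collaborative inference is choosing a split point: early layers run on the edge device, the intermediate activation is sent over the network, and the remaining layers run in the cloud. A minimal sketch of that selection is below; the function name and all per-layer timings, activation sizes, and bandwidth are illustrative assumptions, not code or measurements from this project:

```python
def best_split(edge_ms, cloud_ms, act_kb, bandwidth_kbps):
    """Return (split_index, latency_ms) minimising end-to-end latency
    when layers [0, k) run on the edge device and layers [k, n) run in
    the cloud, with the activation at the split sent over the network.

    act_kb[k] is the size of the tensor transferred when splitting at
    layer k (act_kb[0] is the raw input; act_kb[n] is the final output).
    """
    n = len(edge_ms)
    best = None
    for k in range(n + 1):  # k = 0 is cloud-only, k = n is edge-only
        transfer_ms = act_kb[k] / bandwidth_kbps * 1000.0  # kb / (kb/s) -> ms
        total = sum(edge_ms[:k]) + transfer_ms + sum(cloud_ms[k:])
        if best is None or total < best[1]:
            best = (k, total)
    return best

# Illustrative numbers: a slow mobile GPU versus a fast cloud GPU, with
# activations shrinking in deeper layers (typical of CNNs).
edge_ms = [12.0, 15.0, 20.0, 25.0]        # per-layer time on the phone
cloud_ms = [1.0, 1.5, 2.0, 2.5]           # per-layer time in the cloud
act_kb = [600.0, 300.0, 80.0, 20.0, 4.0]  # transfer size at each split
k, latency = best_split(edge_ms, cloud_ms, act_kb, bandwidth_kbps=1000.0)
```

With these numbers the best split is mid-network: sending the raw input (cloud-only) pays a large transfer cost, while running everything on the weaker mobile GPU (edge-only) pays a large compute cost.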