Poster: Profiling event vision processing on edge devices

As RGB camera resolutions and frame rates improve, their increased energy requirements make it challenging to deploy fast, efficient, and low-power applications on edge devices. Newer classes of sensors, such as the biologically inspired neuromorphic event-based camera, capture only per-pixel changes in light intensity, achieving operational superiority over traditional RGB camera streams in sensing latency (O(μs)), energy consumption (O(mW)), dynamic range (140 dB), and task accuracy (e.g., in object tracking). However, highly dynamic scenes can yield an event rate of up to 12 MEvents/second, the processing of which could overwhelm resource-constrained edge devices. Efficient processing of high volumes of event data is therefore crucial for ultra-fast machine vision on edge devices. In this poster, we present a profiler that processes simulated event streams from RGB videos into six variants of framed representations for DNN inference on an NVIDIA Jetson Orin AGX, a representative edge device. The profiler evaluates the trade-offs between the volume of events evaluated, the quality of the processed event representation, and the processing time, presenting the design choices available to an edge-scale event-camera-based application observing the same RGB scenes. We believe this analysis opens up the exploration of novel system designs for real-time, low-power event vision on edge devices.
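
The abstract describes converting event streams into framed representations for DNN inference, but does not enumerate the six variants. Purely as an illustrative sketch under assumed details (the event array layout, sensor resolution, and time-window length are not from the poster), the following Python/NumPy snippet shows one common framed representation: accumulating a window of events into a per-polarity count frame.

```python
import numpy as np

def events_to_count_frame(events, height, width, t_start, t_window):
    """Accumulate events (x, y, t, polarity) inside a time window into a
    2-channel count frame (one channel per polarity).

    `events` is assumed to be an array of shape (N, 4) with columns
    [x, y, t, p]; this layout and the window-based binning are illustrative
    assumptions, not the representations profiled in the poster.
    """
    frame = np.zeros((2, height, width), dtype=np.float32)

    # Keep only events inside the requested time window.
    t = events[:, 2]
    mask = (t >= t_start) & (t < t_start + t_window)
    x = events[mask, 0].astype(np.int64)
    y = events[mask, 1].astype(np.int64)
    p = (events[mask, 3] > 0).astype(np.int64)  # map polarity to {0, 1}

    # Per-pixel, per-polarity event counts via scatter-add.
    np.add.at(frame, (p, y, x), 1.0)
    return frame


if __name__ == "__main__":
    # Synthetic example: 10,000 random events over a 1 ms window
    # on a hypothetical 346x260 sensor.
    rng = np.random.default_rng(0)
    n = 10_000
    events = np.column_stack([
        rng.integers(0, 346, n),      # x
        rng.integers(0, 260, n),      # y
        rng.uniform(0.0, 1e-3, n),    # t (seconds)
        rng.choice([-1, 1], n),       # polarity
    ]).astype(np.float64)

    frame = events_to_count_frame(events, height=260, width=346,
                                  t_start=0.0, t_window=1e-3)
    print(frame.shape, frame.sum())   # (2, 260, 346) 10000.0
```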

Bibliographic Details
Main Authors: GOKARN, Ila Nitin, MISRA, Archan
Format: Text (application/pdf)
Language: English
Published: Institutional Knowledge at Singapore Management University, 2024
DOI: 10.1145/3643832.3661415
License: CC BY-NC-ND 4.0 (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Collection: Research Collection School Of Computing and Information Systems (InK@SMU, SMU Libraries)
Subjects: Edge AI; Machine Perception; Event Camera; Artificial Intelligence and Robotics; Databases and Information Systems
Online Access:https://ink.library.smu.edu.sg/sis_research/9228
https://ink.library.smu.edu.sg/context/sis_research/article/10228/viewcontent/Mobisys2024_ProfilingEventVision_PosterPaper_cameraready.pdf
Institution: Singapore Management University