Optimization of neural networks through high level synthesis
Format: Final Year Project
Language: English
Published: 2018
Online Access: http://hdl.handle.net/10356/76135
Institution: Nanyang Technological University
Summary: With the increasing popularity of machine learning, coupled with growing computing power, the field of machine learning algorithms has become a dynamic and fast-moving one. The effectiveness of such applications has led to concerted efforts to embed them into other systems. A drawback of machine learning algorithms, however, is their very large computational and space complexity, which requires large amounts of power and/or physical size to run. In embedded systems, these issues pose a problem, as size and performance are key constraints. Optimizing such solutions traditionally requires engineering at the Register Transfer Level (RTL), which is time-consuming and error-prone. In many implementations, it may be preferable to accept a solution that does the job well enough rather than one optimized down to the last bit through RTL design.

In this report, we implement a small-scale machine learning model, a Convolutional Neural Network (CNN) trained offline in Python, on a Field-Programmable Gate Array (FPGA), the ZedBoard. The report explores combinations of compiler directives, or pragmas, which are interpreted by the High-Level Synthesis (HLS) compiler. Through these directives, the designer can influence how the solution is implemented and can improve its space and computational complexity.
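The directives mentioned in the summary are source-level annotations that an HLS compiler reads when mapping C/C++ onto the FPGA fabric. Below is a minimal illustrative sketch, assuming the Xilinx Vivado HLS toolchain commonly used with the ZedBoard at the time; the kernel shape, array sizes, and the particular pragmas are assumptions for illustration, not the design described in the report.

```cpp
// Illustrative sketch only: a small 2-D convolution kernel annotated with
// HLS directives of the kind the report explores. Sizes and names are
// hypothetical; they are not taken from the report's actual design.
#define IMG_H 28
#define IMG_W 28
#define K     3

void conv2d(const float in[IMG_H][IMG_W],
            const float kernel[K][K],
            float out[IMG_H - K + 1][IMG_W - K + 1]) {
    // Split the small kernel array into registers so every tap can be
    // read in the same cycle.
#pragma HLS ARRAY_PARTITION variable=kernel complete dim=0

ROW: for (int r = 0; r < IMG_H - K + 1; r++) {
COL:     for (int c = 0; c < IMG_W - K + 1; c++) {
             // Pipeline the per-pixel computation; II=1 requests one
             // output pixel per clock cycle.
#pragma HLS PIPELINE II=1
             float acc = 0.0f;
KR:          for (int i = 0; i < K; i++) {
KC:              for (int j = 0; j < K; j++) {
                      // Fully unroll the innermost multiply-accumulate loop.
#pragma HLS UNROLL
                      acc += in[r + i][c + j] * kernel[i][j];
                  }
             }
             out[r][c] = acc;
         }
     }
}
```

In a sketch like this, PIPELINE trades area for throughput by overlapping loop iterations, while ARRAY_PARTITION and UNROLL expose the parallel multiply-accumulates to the scheduler. Varying such combinations of directives is the kind of space/performance trade-off the designer can steer from the source code, without hand-writing RTL.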