Modelling a neural network using an algebraic method
Main Authors:
Format: Journal
Published: 2018
Online Access:
https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=84896511241&origin=inward
http://cmuir.cmu.ac.th/jspui/handle/6653943832/45262
Institution: Chiang Mai University
Summary: In this paper, a framework based on algebraic structures for formalizing various types of neural networks is presented. The working strategy is to break neural networks down into building blocks, the relationships between those building blocks, and their operations. Building blocks are collections of primary components, or neurons. In turn, neurons are collections of properties functioning as single entities that transform an input into an output. We perceive a neuron as a function; thus the flow of information in a neural network is a composition of functions. Moreover, we define an abstract data structure called a layer, which is a collection of entities that exist in the same time step. This layer concept allows parallel computation in our model. There are two types of operator in our model: recalling operators and training operators. Recalling operators challenge the neural network with data, while training operators change the parameters of neurons to fit the data. From this point of view, all neural networks can be constructed or modelled using the same structures with different parameters.
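To make the algebraic reading of the abstract concrete, here is a minimal sketch of the idea that a neuron is a function, a layer is a collection of neurons existing in the same time step, information flow is function composition, and recalling/training operators act on the structure. All class and function names, the linear neuron, and the update rule are illustrative assumptions, not the authors' actual formalization.

```python
# Minimal sketch, assuming a linear neuron and a toy error-driven update;
# the paper's framework is abstract and does not prescribe these choices.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Neuron:
    """A neuron viewed as a function: parameters bundled into a single entity
    that transforms an input into an output."""
    weight: float = 1.0
    bias: float = 0.0

    def __call__(self, x: float) -> float:
        return self.weight * x + self.bias


@dataclass
class Layer:
    """A layer: a collection of neurons in the same time step, so their
    outputs can in principle be computed in parallel."""
    neurons: List[Neuron] = field(default_factory=list)

    def recall(self, x: float) -> List[float]:
        # Recalling operator: challenge every neuron in the layer with the data.
        return [neuron(x) for neuron in self.neurons]

    def train(self, x: float, targets: List[float], lr: float = 0.1) -> None:
        # Training operator: change each neuron's parameters to fit the data
        # (one gradient-style step on a squared error; purely illustrative).
        for neuron, target in zip(self.neurons, targets):
            err = neuron(x) - target
            neuron.weight -= lr * err * x
            neuron.bias -= lr * err


def compose(*layers: Layer) -> Callable[[float], List[float]]:
    """Information flow as composition: feed each layer's output into the next."""
    def run(x: float) -> List[float]:
        outputs = [x]
        for layer in layers:
            # Summing the previous layer's outputs is an assumption about wiring;
            # the framework leaves the connection pattern to the operators.
            outputs = layer.recall(sum(outputs))
        return outputs
    return run


if __name__ == "__main__":
    network = compose(Layer([Neuron(), Neuron()]), Layer([Neuron()]))
    print(network(2.0))  # recall the two-layer network on a single input
```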