Learning two-layer neural networks with symmetric inputs
We give a new algorithm for learning a two-layer neural network under a very general class of input distributions. Assuming there is a ground-truth two-layer network $y = A \sigma(Wx) + \xi$, where $A$, $W$ are weight matrices, $\xi$ represents noise, and the number of neurons in the hidden layer is no...
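The generative model in the abstract can be illustrated with a short sketch. Everything concrete here — the dimensions, the choice of ReLU for $\sigma$, and a standard Gaussian as the symmetric input distribution (it satisfies $x \sim -x$) — is an illustrative assumption, not a detail taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: input dim d, hidden neurons k, output dim m.
d, k, m = 10, 5, 3
W = rng.standard_normal((k, d))   # first-layer weight matrix
A = rng.standard_normal((m, k))   # second-layer weight matrix

def sample(n, noise_std=0.01):
    """Draw n (x, y) pairs from the ground-truth network y = A sigma(Wx) + xi."""
    x = rng.standard_normal((n, d))       # symmetric input: x and -x equally likely
    hidden = np.maximum(W @ x.T, 0.0)     # sigma(Wx) with sigma = ReLU (assumed)
    xi = noise_std * rng.standard_normal((m, n))  # additive noise term
    y = A @ hidden + xi
    return x, y.T

X, Y = sample(4)
print(X.shape, Y.shape)  # (4, 10) (4, 3)
```

The learner observes only such $(x, y)$ pairs; recovering $A$ and $W$ from them is the problem the paper's algorithm addresses.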
Main Authors: GE, Rong; KUDITIPUDI, Rohith; LI, Zhize; WANG, Xiang
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2019
Online Access: https://ink.library.smu.edu.sg/sis_research/8676
https://ink.library.smu.edu.sg/context/sis_research/article/9679/viewcontent/ICLR19_symmetric.pdf
Institution: Singapore Management University
Similar Items
- Incremental neural network training with an increasing input dimension
  by: Guan, S.-U., et al.
  Published: (2014)
- Multi-order Neurons for evolutionary higher order clustering and growth
  by: RAMANATHAN, Kiruthika, et al.
  Published: (2007)
- A Hubel Wiesel model of early concept generalization based on local correlation of input features
  by: SADEGHI, Sepideh, et al.
  Published: (2011)
- Self-organizing neural networks for learning air combat maneuvers
  by: TENG, Teck-Hou, et al.
  Published: (2012)
- Direct code access in self-organizing neural networks for reinforcement learning
  by: TAN, Ah-hwee
  Published: (2007)