Network Capsules for Deep Neural Networks - [Summary]

<center>![Resources #50.png](https://steemitimages.com/DQmc1SStaXWtNNChSEVfEzRQrS8wowV58KunyDntLMTbwCa/Resources%20%2350.png)</center>
___
A recent paper by [Hinton and colleagues (2017)](https://arxiv.org/abs/1710.09829v1) has generated enthusiasm in the deep learning community, as it challenges the performance of CNNs:

_"We show that a discriminatively trained, multi-layer capsule system achieves state-of-the-art performance on MNIST and is considerably better than a convolutional net at recognizing highly overlapping digits."_ [[source](https://arxiv.org/abs/1710.09829v1)]

Eugenio Culurciello, at the _MLReview_ blog on Medium, has posted an insightful summary of the key ideas behind this research, aimed at readers who find the technicalities of Hinton's paper hard to follow. From the post:

_"Deep neural nets learn by back-propagation of errors over the entire network. In contrast real brains supposedly wire neurons by Hebbian principles: “units that fire together, wire together”. Capsules mimic Hebbian learning..."_ [[source](https://medium.com/mlreview/deep-neural-network-capsules-137be2877d44)]

Culurciello also discusses _pooling_ in standard deep neural nets versus the more dynamic routing used in capsules, prediction, the resemblance of capsules to cortical columns in the human brain, and more. The post also includes a high-level overview of the capsule network architecture for illustration purposes.
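To make the "dynamic pooling" idea concrete, here is a minimal NumPy sketch of the two core pieces from Hinton's paper: the _squash_ non-linearity, which keeps a capsule's output vector length between 0 and 1 so it can be read as a probability, and the routing-by-agreement loop that replaces max-pooling. The array shapes and function names are my own illustrative choices, not from the paper or the blog post.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    """Squash non-linearity from Sabour, Frosst & Hinton (2017):
    shrinks short vectors toward zero and scales long vectors to
    just under unit length, preserving their direction."""
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    scale = sq_norm / (1.0 + sq_norm)
    return scale * s / np.sqrt(sq_norm + eps)

def route(u_hat, iterations=3):
    """Routing-by-agreement: lower-level capsules send more of their
    output to higher-level capsules whose output agrees with their
    prediction. u_hat has shape (num_in, num_out, dim)."""
    num_in, num_out, dim = u_hat.shape
    b = np.zeros((num_in, num_out))          # routing logits, start uniform
    for _ in range(iterations):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # coupling coefficients
        s = (c[..., None] * u_hat).sum(axis=0)  # weighted sum per output capsule
        v = squash(s)                           # (num_out, dim) output capsules
        b = b + (u_hat * v[None]).sum(axis=-1)  # reward agreement (dot product)
    return v

# A long input keeps its direction but ends up with length just under 1.
v = squash(np.array([3.0, 4.0]))  # length 25/26, same direction as [3, 4]
```

Unlike max-pooling, which discards everything but the strongest activation, this loop re-weights the connections at inference time based on agreement, which is the sense in which Culurciello calls capsule routing "more dynamic".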

So, if you don't have time to fully digest Hinton's paper, Culurciello's summary is a good read:

<center>[Network Capsules for Deep Neural Networks](https://medium.com/mlreview/deep-neural-network-capsules-137be2877d44)</center>
___
### <center>To stay in touch with me, follow @cristi</center>   
___

[Cristi Vlad](http://cristivlad.com), Self-Experimenter and Author