Overcoming the Limitations of Neural Networks in Composite-Pattern Learning with Architopes

The effectiveness of neural networks in solving complex problems is well recognized; however, little is known about their limitations. We demonstrate that the feed-forward architecture, for most commonly used activation functions, is incapable of approximating functions composed of multiple sub-patterns while simultaneously respecting their composite-pattern structure.
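To make the phrase "composed of multiple sub-patterns" concrete, here is a small, purely illustrative example of such a target: a function built from two distinct sub-patterns glued together along a partition of the input space. The specific pieces, the split point at 0.5, and the name `composite_target` are assumptions chosen only for illustration, not taken from the paper.

```python
import numpy as np

def composite_target(x: np.ndarray) -> np.ndarray:
    """A toy composite-pattern target on a two-cell partition of [0, 1)."""
    # Sub-pattern 1 on [0, 0.5): a smooth oscillation.
    left = np.sin(4 * np.pi * x)
    # Sub-pattern 2 on [0.5, 1): an affine trend, creating a jump at the boundary.
    right = 2.0 * x - 1.5
    return np.where(x < 0.5, left, right)
```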

We overcome this bottleneck with a simple architecture modification that reallocates the neurons of any single feed-forward network across several smaller sub-networks, each specialized on a distinct part of the input space. The modified architecture, called an Architope, is more expressive on two fronts. First, it is dense in an associated ...

To finish reading, please visit the source site.
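As a rough illustration of the modification described above, here is a minimal sketch, assuming PyTorch, of the reallocation idea: the hidden neurons of one feed-forward network are split across several smaller sub-networks, each assigned to one cell of a fixed partition of the input space. The `make_mlp` helper, the equal split of the hidden-neuron budget, and the partition rule (bucketing points by their first coordinate) are hypothetical stand-ins for illustration, not the authors' construction.

```python
# A minimal sketch, assuming PyTorch; not the authors' reference implementation.
import torch
import torch.nn as nn


def make_mlp(in_dim: int, hidden: int, out_dim: int) -> nn.Sequential:
    # One small feed-forward block; several of these share the neuron budget
    # that a single larger feed-forward network would otherwise use.
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, out_dim))


class Architope(nn.Module):
    # Routes each input to the sub-network responsible for its partition cell.
    def __init__(self, in_dim: int, out_dim: int, total_hidden: int, n_parts: int):
        super().__init__()
        hidden_per_part = max(1, total_hidden // n_parts)  # reallocate the neuron budget
        self.subnets = nn.ModuleList(
            [make_mlp(in_dim, hidden_per_part, out_dim) for _ in range(n_parts)]
        )
        self.n_parts = n_parts
        self.out_dim = out_dim

    def partition(self, x: torch.Tensor) -> torch.Tensor:
        # Hypothetical partition of the input space: slice [0, 1) along the
        # first coordinate into n_parts equal cells.
        return (x[:, 0].clamp(0.0, 1.0 - 1e-6) * self.n_parts).long()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        idx = self.partition(x)
        out = torch.zeros(x.shape[0], self.out_dim, device=x.device)
        for k, net in enumerate(self.subnets):
            mask = idx == k
            if mask.any():
                out[mask] = net(x[mask])  # each cell handled by its own sub-network
        return out


# Usage: 64 hidden neurons split across 4 sub-networks instead of one 64-neuron MLP.
model = Architope(in_dim=2, out_dim=1, total_hidden=64, n_parts=4)
predictions = model(torch.rand(8, 2))
```

With the same total hidden-neuron budget, the single network is replaced here by four 16-neuron sub-networks, one per partition cell; the model is an ordinary `nn.Module` and trains like any other.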