Limitations of multi-layer perceptron networks - steps towards genetic neural networks
Authors: H. Mühlenbein
Affiliation:

Gesellschaft für Mathematik und Datenverarbeitung mbH, Postfach 1240, D-5205, St. Augustin, F.R. Germany

Abstract: In this paper we investigate multi-layer perceptron networks in the task domain of Boolean functions. We demystify the multi-layer perceptron network by showing that it simply divides the input space into regions constrained by hyperplanes. We use this information to construct minimal training sets. Despite using minimal training sets, the learning time of multi-layer perceptron networks with backpropagation scales exponentially for complex Boolean functions. But modular neural networks which consist of independently trained subnetworks scale very well. We conjecture that the next generation of neural networks will be genetic neural networks which evolve their structure. We confirm Minsky and Papert: "The future of neural networks is tied not to the search for some single, universal scheme to solve all problems at once, but to the evolution of a many-faceted technology of network design."
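The abstract's central observation, that a multi-layer perceptron just divides the input space into regions bounded by hyperplanes, can be illustrated with the classic XOR function. The sketch below is not the paper's construction; it hand-wires a two-layer threshold network whose hidden units are hyperplanes in the input plane, and whose output fires exactly in the region between them.

```python
import numpy as np

# Illustrative sketch (not the paper's construction): a two-layer
# perceptron computes XOR by intersecting half-spaces.
# Hidden unit 1 fires when x1 + x2 >= 0.5 (at least one input on);
# hidden unit 2 fires when x1 + x2 >= 1.5 (both inputs on).
# The output unit fires for h1 AND NOT h2, i.e. exactly one input on.

def step(z):
    """Heaviside threshold activation."""
    return (np.asarray(z) >= 0).astype(int)

W1 = np.array([[1.0, 1.0],    # hyperplane x1 + x2 = 0.5
               [1.0, 1.0]])   # hyperplane x1 + x2 = 1.5
b1 = np.array([-0.5, -1.5])
W2 = np.array([1.0, -2.0])    # h1 - 2*h2 - 0.5 >= 0 only when h1=1, h2=0
b2 = -0.5

def xor_net(x):
    h = step(W1 @ x + b1)          # which side of each hyperplane?
    return int(step(W2 @ h + b2))  # combine the regions

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", xor_net(np.array(x)))  # prints 0, 1, 1, 0
```

Each hidden unit answers "which side of my hyperplane is the input on?", so the network's decision regions are intersections of half-planes, which is the geometric picture the paper uses to build minimal training sets.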
Keywords: Learning; Boolean functions; minimal training set; modular neural network
This document is indexed in ScienceDirect and other databases.