Parallel evolutionary training algorithms for “hardware-friendly” neural networks
Authors: Vassilis P. Plagianakos, Michael N. Vrahatis
Affiliation: (1) Department of Mathematics and Artificial Intelligence Research Center (UPAIRC), University of Patras, GR-26110 Patras, Greece; (2) Department of Mathematics, University of Patras, GR-26110 Patras, Greece; (3) University of Patras Artificial Intelligence Research Center (UPAIRC), Greece
Abstract: In this paper, parallel evolutionary algorithms for integer-weight neural network training are presented. To this end, each processor is assigned a subpopulation of potential solutions. The subpopulations are evolved independently in parallel, and occasional migration is employed to allow cooperation between them. The proposed algorithms are applied to train neural networks using threshold activation functions and weight values confined to a narrow band of integers. We constrain the weights and biases to the range [-3, 3], so they can be represented by just 3 bits. Such neural networks are better suited for hardware implementation than their real-weight counterparts. These algorithms have been designed keeping in mind that the resulting integer weights require fewer bits to store and that the digital arithmetic operations between them are easier to implement in hardware. Another advantage of the proposed evolutionary strategies is that they are capable of continuing the training process "on-chip", if needed. Our intention is to present results of parallel evolutionary algorithms on this difficult task. Based on the application of the proposed class of methods to classical neural network problems, our experience is that these methods are effective and reliable.
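To make the approach concrete, below is a minimal Python sketch of the kind of training the abstract describes: differential evolution (DE/rand/1/bin) over integer weight vectors clipped to [-3, 3], a single-hidden-layer network with hard-threshold activations, and island subpopulations with occasional ring migration. This is an illustrative assumption-laden sketch, not the authors' implementation: the network shape, population size, F, CR, and migration interval are all hypothetical choices, and the islands are evolved sequentially here, whereas the paper assigns one subpopulation per processor.

    import numpy as np

    rng = np.random.default_rng(0)
    W_MIN, W_MAX = -3, 3  # weights and biases confined to [-3, 3] (3 bits)

    def unflatten(flat, shapes):
        """Split a flat integer vector into the network's weight arrays."""
        parts, i = [], 0
        for s in shapes:
            n = int(np.prod(s))
            parts.append(flat[i:i + n].reshape(s))
            i += n
        return parts

    def predict(flat, shapes, X):
        """Single hidden layer with hard-threshold (step) activations."""
        W1, b1, W2, b2 = unflatten(flat, shapes)
        h = (X @ W1 + b1 > 0).astype(int)   # threshold activation
        return (h @ W2 + b2 > 0).astype(int)

    def error(flat, shapes, X, y):
        return float(np.mean(predict(flat, shapes, X) != y))

    def de_generation(pop, fit, shapes, X, y, F=1, CR=0.5):
        """One DE/rand/1/bin generation; mutants are rounded and clipped
        back into the integer band [-3, 3]."""
        npop, dim = pop.shape
        for i in range(npop):
            others = [j for j in range(npop) if j != i]
            a, b, c = pop[rng.choice(others, 3, replace=False)]
            mutant = np.clip(np.rint(a + F * (b - c)), W_MIN, W_MAX).astype(int)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True   # guarantee one mutated gene
            trial = np.where(cross, mutant, pop[i])
            f = error(trial, shapes, X, y)
            if f <= fit[i]:                   # greedy one-to-one selection
                pop[i], fit[i] = trial, f
        return pop, fit

    def train(X, y, n_hidden=4, n_islands=4, pop_size=20,
              generations=200, migrate_every=25):
        """Evolve island subpopulations independently (sequentially here,
        one per processor in the paper's setting), migrating each island's
        best individual to its ring neighbour every few generations."""
        shapes = [(X.shape[1], n_hidden), (n_hidden,), (n_hidden,), (1,)]
        dim = sum(int(np.prod(s)) for s in shapes)
        islands = [rng.integers(W_MIN, W_MAX + 1, size=(pop_size, dim))
                   for _ in range(n_islands)]
        fits = [np.array([error(v, shapes, X, y) for v in pop])
                for pop in islands]
        for g in range(generations):
            for k in range(n_islands):
                islands[k], fits[k] = de_generation(islands[k], fits[k],
                                                    shapes, X, y)
            if (g + 1) % migrate_every == 0:
                for k in range(n_islands):    # ring migration of the best
                    dst = (k + 1) % n_islands
                    best = islands[k][np.argmin(fits[k])].copy()
                    worst = int(np.argmax(fits[dst]))
                    islands[dst][worst] = best
                    fits[dst][worst] = error(best, shapes, X, y)
        k = min(range(n_islands), key=lambda j: fits[j].min())
        return islands[k][np.argmin(fits[k])], shapes

    # Toy usage: XOR with 3-bit integer weights and threshold units.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y = np.array([0, 1, 1, 0])
    best, shapes = train(X, y)
    print("training error:", error(best, shapes, X, y))

Rounding and clipping the DE mutants keeps every candidate inside the 3-bit integer band at all times, so the evolved network remains directly storable in hardware, and since the search never needs gradients, the same loop could in principle continue "on-chip" after deployment.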
Keywords: "hardware-friendly" implementations  integer weight neural networks  "on-chip" training  parallel differential evolution algorithms  threshold activation functions
This article is indexed in SpringerLink and other databases.