A Constructive Approach to Calculating Lower Entropy Bounds
Authors: Valeriu Beiu, Sorin Draghici, Thierry De Pauw
Affiliations: (1) Division NIS–1, Los Alamos National Laboratory, Mail Stop D466, Los Alamos, New Mexico 87545, U.S.A.; (2) Vision and Neural Networks Laboratory, Department of Computer Science, Wayne State University, 431 State Hall, Detroit, Michigan 48202, U.S.A.; (3) Département de Mathématique, Université Catholique de Louvain, Chemin du Cyclotron 2, B–1348 Louvain-la-Neuve, Belgium.
Abstract: This paper presents a constructive approach to estimating the size of a neural network necessary to solve a given classification problem. The results are derived from an information entropy perspective in the context of limited-precision integer weights. Such weights are particularly suited to hardware implementations, since the area they occupy is limited and the computations performed with them can be implemented efficiently in hardware. Lower bounds are calculated on the number of bits needed to solve a given classification problem; these bounds are obtained by approximating the classification hypervolumes with the volumes of several regular (i.e., highly symmetric) n-dimensional bodies. The bounds given here allow the user to choose an appropriate network size such that: (i) the given classification problem can be solved, and (ii) the network architecture is not oversized. All considerations take into account the restrictive case of limited-precision integer weights, and can therefore be applied directly when designing VLSI implementations of neural networks.
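The abstract's volume-based entropy argument can be illustrated with a small sketch. The exact bound derived in the paper is not reproduced here; the code below merely assumes an entropy-style bound of the form bits ≥ log2(V_domain / V_class), where a classification region's hypervolume is approximated by that of a regular n-dimensional body (here, an n-ball). The function names, the radius, and the unit-cube domain are all illustrative assumptions, not the authors' construction.

```python
import math

def ball_volume(n, r):
    # Volume of an n-dimensional ball of radius r:
    # V_n(r) = pi^(n/2) * r^n / Gamma(n/2 + 1)
    return math.pi ** (n / 2) * r ** n / math.gamma(n / 2 + 1)

def entropy_lower_bound_bits(domain_volume, class_volume):
    # Entropy-style lower bound (illustrative assumption): the number of
    # bits needed to single out a region of class_volume inside a domain
    # of domain_volume, i.e. log2 of the volume ratio.
    return math.log2(domain_volume / class_volume)

# Example: a classification region approximated by a 3-D ball of radius
# 0.1 inside the unit cube [0, 1]^3 (assumed domain).
v_region = ball_volume(3, 0.1)
bits = entropy_lower_bound_bits(1.0, v_region)
print(f"lower bound: {bits:.2f} bits")
```

As the dimension n grows, the volume of an n-ball of fixed radius shrinks rapidly relative to the enclosing cube, so volume-ratio bounds of this kind grow with dimension; this is the qualitative effect the paper's symmetric-body approximations quantify.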
Keywords: classification problems; complexity; n-dimensional volumes; number of bits; limited and integer weights; constructive algorithms
This article is indexed by SpringerLink and other databases.