Similar Documents
18 similar documents found
1.
Wang Tao, Zhuang Xinhua. Chinese Journal of Computers (《计算机学报》), 1993, 16(2): 97-105
This paper proposes an optimization learning algorithm for a heteroassociative memory model. We first convert a criterion reflecting the performance of the neural network into an easily controlled cost function, so that determining the weights naturally becomes a global optimization process, carried out by gradient descent. The learning algorithm guarantees that every training pattern becomes a stable attractor of the system, with a maximal basin of attraction in the optimization sense. We discuss theoretically the storage capacity of the heteroassociative memory model, the asymptotic stability of the training patterns, and the extent of their basins of attraction. Computer experiments fully demonstrate the effectiveness of the algorithm.
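The cost-function formulation described in this abstract can be illustrated with a minimal sketch: a linear heteroassociator trained by gradient descent on a plain squared recall error. This is not the paper's actual cost function or algorithm; all function names and parameter choices here are ours.

```python
import numpy as np

def train_heteroassociative(X, Y, lr=0.05, epochs=500):
    """Gradient descent on the squared-error cost J(W) = sum_k ||y_k - W x_k||^2."""
    n_in, n_out = X.shape[1], Y.shape[1]
    W = np.zeros((n_out, n_in))
    for _ in range(epochs):
        for x, y in zip(X, Y):
            err = y - W @ x             # residual for this pattern pair
            W += lr * np.outer(err, x)  # gradient step (constant folded into lr)
    return W

def recall(W, x):
    """One-shot recall: threshold the linear map to a bipolar output."""
    return np.where(W @ x >= 0, 1, -1)
```

For mutually orthogonal bipolar input patterns this converges to exact recall of every stored pair.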

2.
Liang Xuebin, Wu Lide. Journal of Software (《软件学报》), 1996, 7(Z1): 267-272
Based on the idea that the basins of attraction of the stored patterns in an associative memory should be kept balanced in size, this paper proposes a minimax criterion for designing Hopfield associative memory networks: the symmetric connection weight matrix should maximize the smallest basin of attraction among the stored patterns. A fast learning algorithm is first proposed; a heuristic iterative learning algorithm, called the constrained perceptron optimization learning algorithm, is then developed. Extensive experimental results demonstrate the superiority of the proposed learning algorithms.
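A classic realization of this minimax idea (not the authors' constrained perceptron algorithm) is a MinOver-style rule in the spirit of Krauth and Mézard: repeatedly find the pattern/neuron with the smallest stability margin and reinforce it, which pushes the minimum margin, and hence the smallest basin, upward. The sketch below uses our own names and parameters.

```python
import numpy as np

def minover_store(patterns, iters=2000, lr=0.1):
    """MinOver-style learning: at each step reinforce the weakest margin.

    The (unnormalized) stability of neuron i on pattern x is x_i * (w_i . x);
    raising the smallest of these enlarges the smallest basin of attraction.
    """
    n = len(patterns[0])
    W = np.zeros((n, n))
    for _ in range(iters):
        margins = [(x[i] * (W[i] @ x), i, x) for x in patterns for i in range(n)]
        m, i, x = min(margins, key=lambda t: t[0])  # weakest pattern/neuron
        W[i] += lr * x[i] * x                        # Hebbian-style reinforcement
        W[i, i] = 0.0                                # keep zero self-connection
    return W
```

After training, every stored pattern satisfies the sign condition x_i (W x)_i > 0, i.e., it is a stable state.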

3.
A Design Method for Forward Masking Associative Memory Models with Desired Fault-Tolerance Domains
The synthesis problem of associative memories has not yet been satisfactorily solved. Using the general feedforward network and the sorting learning algorithm previously proposed by the authors, this paper presents a method for designing forward masking associative memory models with desired fault-tolerance domains. The method solves, in general terms, the synthesis problem of associative memory over an information space, allowing the designed model to give each stored pattern an arbitrary desired fault-tolerance domain.

4.
A Continuous-Time Associative Memory Neural Network Based on Constraint Regions
Tao Qing, Fang Tingjian, Sun Demin. Chinese Journal of Computers (《计算机学报》), 1999, 22(12): 1253-1258
Traditional associative memory neural network models design the weights directly from the memory points. This paper proposes a neural network model based on constraint regions, also designed from the memory points, which guarantees that the set of asymptotically stable equilibria coincides with the sample point set, that the equilibria which are not asymptotically stable are exactly the practical rejection states, and that the basins of attraction are reasonably distributed. The model can both learn and forget, has a large memory capacity, and admits circuit implementation, making it an ideal associative memory.

5.
An Enhanced Learning Algorithm for Fuzzy Associative Memory Networks
To address problems with the max-min fuzzy associative memory network proposed by Kosko, this paper improves the network's weight learning rule and gives an alternative rule, extending Kosko's feedforward fuzzy associative memory into a fuzzy bidirectional associative memory model, and derives from it a fast fuzzy enhanced learning algorithm that can store any given set of multivalued training pattern pairs. When storing sets of binary pattern pairs, the connection weights take only the values 0 and 1, so the algorithm lends itself to hardware and optical implementation. Experimental results show that the fast fuzzy enhanced learning algorithm is effective.
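The baseline this entry improves upon is Kosko's max-min fuzzy associative memory, whose encoding and recall can be sketched as follows (an illustrative baseline only; the function names and test patterns are ours, not from the paper):

```python
import numpy as np

def fam_encode(A, B):
    """Max-min correlation encoding: w_ij = max over pairs of min(a_i, b_j)."""
    W = np.zeros((A.shape[1], B.shape[1]))
    for a, b in zip(A, B):
        W = np.maximum(W, np.minimum.outer(a, b))  # superimpose pair matrices
    return W

def fam_recall(W, a):
    """Max-min composition: b_j = max_i min(a_i, w_ij)."""
    return np.max(np.minimum(a[:, None], W), axis=0)
```

A single pattern pair is recalled perfectly, but with several superimposed pairs the max-min encoding generally cannot store all of them without crosstalk, which is precisely the limitation the enhanced learning algorithm above addresses.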

6.
A Perceptual Associative Memory Model with Time-Varying Fault-Tolerance Domains and Its Implementation Algorithm
This paper proposes, and implements in software, a perceptual associative memory model with time-varying fault-tolerance domains. The model has the following features: (1) it mimics the way the fault-tolerance domains of associative memory in the brain vary over time, and can assign each stored sample an appropriate fault-tolerance domain at different times according to the sample's importance; (2) it realizes nonlinear association of infinite-valued patterns from n-dimensional to m-dimensional space, the fault-tolerance domains of the samples fill the entire real space R^n, and the model has no spurious attractors; (3) associative recall is fast.

7.
Constrained Optimization Learning for Associative Memory Networks
Wang Tao, Yu Ruizhao. Chinese Journal of Computers (《计算机学报》), 1995, 18(12): 886-892
This paper proposes a constrained optimization learning algorithm for associative memory networks. Learning is a global minimization process whose initial solution guarantees that every sample is a stable state of the system; the basins of attraction of the samples are then enlarged step by step, giving the network maximal basins of attraction in the optimization sense. We analyze theoretically the asymptotic stability of the samples, the extent of their basins of attraction, and the convergence of the learning algorithm. Extensive computer experiments show that the learning algorithm is effective.

8.
Chen Songcan, Zhu Wu. Journal of Software (《软件学报》), 1998, 9(11): 814-819
This paper proposes a new higher-order bidirectional associative memory model. It generalizes the higher-order bidirectional associative memory model HOBAM (higher-order bidirectional associative memory) proposed by Tai and Jeng, as well as the modified intraconnected bidirectional associative memory model MIBAM (modified intraconnected bidirectional associative memory). By defining an energy function, the stability of the new model under both synchronous and asynchronous update is proved, which guarantees that all trained pattern pairs become asymptotically stable points of the model. Using statistical analysis, the storage capacity of the proposed model is estimated. Computer simulations confirm that the model has not only a higher storage capacity but also better error-correcting capability.

9.
Liang Xuebin, Wu Lide. Journal of Software (《软件学报》), 1996, 7(A00): 267-272
Based on the idea that the basins of attraction of the stored patterns in an associative memory should be kept balanced in size, this paper proposes a minimax criterion for designing Hopfield associative memory networks: the symmetric connection weight matrix should maximize the smallest basin of attraction among the stored patterns. A fast algorithm is first proposed; a heuristic iterative learning algorithm, called the constrained perceptron learning algorithm, is then developed. Extensive experimental results demonstrate the superiority of the proposed learning algorithms.

10.
Because a typical discrete Hopfield neural network has many spurious stable points, the basins of attraction of the true stable points shrink and the network can rarely reach the true optimum. This paper therefore applies a genetic algorithm to the Hopfield associative memory network: exploiting the genetic algorithm's global search over complex, multimodal, highly nonlinear, and non-differentiable functions, it optimizes the basins of attraction of the Hopfield associative memory so that the probe pattern escapes the basins of spurious patterns, and the Hopfield network maintains a high recall success rate even at high noise-to-signal ratios. Simulation results confirm the effectiveness of the method.
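The combination described here can be sketched as a toy: Hebbian Hopfield weights define an energy landscape, and a simple genetic algorithm (tournament selection, uniform crossover, bit-flip mutation, elitism) searches for a low-energy state near the probe. This is our own minimal illustration, not the paper's algorithm, and all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weights(patterns):
    """Outer-product (Hebbian) connection matrix with zero diagonal."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, x):
    """Hopfield energy; stored patterns sit at deep minima."""
    return -0.5 * x @ W @ x

def ga_recall(W, probe, pop_size=30, gens=60, p_mut=0.05):
    """Search for a low-energy state near the probe with a simple GA."""
    n = len(probe)
    # population of noisy copies of the probe (about 20% of bits flipped)
    pop = np.array([probe * np.where(rng.random(n) < 0.2, -1, 1)
                    for _ in range(pop_size)])
    pop[0] = probe
    best = min(pop, key=lambda x: energy(W, x))  # elitism: track best ever seen
    for _ in range(gens):
        fit = np.array([-energy(W, x) for x in pop])
        # binary tournament selection
        idx = [max(rng.choice(pop_size, 2, replace=False), key=lambda i: fit[i])
               for _ in range(pop_size)]
        parents = pop[idx]
        children = []
        for i in range(0, pop_size, 2):
            a, b = parents[i], parents[(i + 1) % pop_size]
            mask = rng.random(n) < 0.5          # uniform crossover
            children.append(np.where(mask, a, b))
            children.append(np.where(mask, b, a))
        pop = np.array(children)[:pop_size]
        flip = rng.random(pop.shape) < p_mut    # bit-flip mutation
        pop = pop * np.where(flip, -1, 1)
        cand = min(pop, key=lambda x: energy(W, x))
        if energy(W, cand) < energy(W, best):
            best = cand
    return best
```

Because the probe is seeded into the population and the best state is tracked across generations, the returned state is never worse (in energy) than the probe itself.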

11.
We present a study of generalised Hopfield networks for associative memory. By analysing the radius of attraction of a stable state, the Object Perceptron Learning Algorithm (OPLA) and OPLA scheme are proposed to store a set of sample patterns (vectors) in a generalised Hopfield network with their radii of attraction as large as we require. OPLA modifies a set of weights and a threshold in a way similar to the perceptron learning algorithm. The simulation results show that the OPLA scheme is more effective for associative memory than both the sum-of-outer-product scheme with a Hopfield network and the weighted sum-of-outer-product scheme with an asymmetric Hopfield network.
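The perceptron-style weight modification mentioned in this abstract can be illustrated with a minimal zero-threshold variant (this is not OPLA itself; names, the margin value, and the learning rate are our own choices): whenever a neuron's local field disagrees with the stored pattern by more than a margin, its row of weights is nudged in the Hebbian direction.

```python
import numpy as np

def perceptron_store(patterns, margin=1.0, lr=0.1, max_epochs=200):
    """Perceptron-style storage: enforce x_i * (W x)_i >= margin for all i, all patterns."""
    n = len(patterns[0])
    W = np.zeros((n, n))
    for _ in range(max_epochs):
        stable = True
        for x in patterns:
            h = W @ x                         # local fields (h[i] depends only on row i)
            for i in range(n):
                if x[i] * h[i] < margin:      # neuron i violates the margin
                    W[i] += lr * x[i] * x     # perceptron-style correction
                    W[i, i] = 0.0             # keep zero self-connection
                    stable = False
        if stable:
            break
    return W

def hopfield_recall(W, x, steps=10):
    """Synchronous recall by repeated thresholding."""
    for _ in range(steps):
        x = np.where(W @ x >= 0, 1, -1)
    return x
```

Once training exits with all margins satisfied, every stored pattern is a fixed point of a single recall step.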

12.
This paper analyzes in detail the learning and recall processes of the bidirectional associative memory (BAM). In learning, an adaptive asymmetric BAM algorithm is applied first, followed by a repeated-memorization algorithm that sets an impression threshold; the relationship between the impression threshold and the basins of attraction of the samples is proved theoretically, establishing the theoretical basis for repeated memorization. In recall, a running equation with a nonzero threshold function is adopted and a threshold learning method is proposed; it is proved theoretically that adopting the nonzero-threshold running equation further enlarges the basins of attraction. To further increase the network's storage capacity, a parallel BAM structure is introduced. With these methods, the storage capacity and error-correcting ability of the BAM network are greatly improved.
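The improvements above build on the basic Kosko BAM, whose correlation encoding and bidirectional recall can be sketched as follows (an illustrative baseline, not the paper's improved algorithm; the test patterns are ours):

```python
import numpy as np

def bam_encode(X, Y):
    """Kosko's correlation encoding: M = sum_k x_k y_k^T over bipolar pattern pairs."""
    return sum(np.outer(x, y) for x, y in zip(X, Y))

def bam_recall(M, x, iters=5):
    """Bidirectional recall: alternate forward and backward thresholded passes."""
    for _ in range(iters):
        y = np.where(x @ M >= 0, 1, -1)   # forward pass (X layer -> Y layer)
        x = np.where(M @ y >= 0, 1, -1)   # backward pass (Y layer -> X layer)
    return x, y
```

For mutually orthogonal bipolar patterns on both layers, each stored pair is a stable resonance of this two-way iteration.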

13.
The paper offers a new kind of neural network for classifying binary patterns. For a given pattern dimensionality, the memory capacity of the network grows exponentially with the free parameter s. The paper considers the limitations on s caused by the fact that larger values demand more computer memory and shrink the basins of attraction. In contrast to similar models, the network enjoys larger memory capacity and better recognition capabilities: it can distinguish heavily distorted patterns and even cope with pattern correlation, whose negative effect can be easily suppressed by taking a large enough value of s. A perceptron recognition system is considered to demonstrate the efficiency of the algorithm, yet the method is quite applicable in fully connected associative-memory networks.

14.
A BSB Associative Memory Model Based on Constraint Regions
This paper proposes a BSB (Brain-State-in-a-Box) neural network model based on constraint regions, designed from the memory points. It guarantees that the set of asymptotically stable equilibria coincides with the sample point set, that the equilibria which are not asymptotically stable are exactly the practical rejection states, and that the basins of attraction are reasonably distributed, thereby perfecting the BSB into an ideal associative memory.

15.
This paper obtains several estimates of the basins of attraction of the stored patterns of analog feedback associative memories, and of the exponential rate at which each point in a basin converges to the corresponding stored pattern. These estimates can be used in performance evaluation and in the synthesis of high-efficiency analog feedback associative memories.

16.
In a previous paper, the self-trapping network (STN) was introduced as more biologically realistic than attractor neural networks (ANNs) based on the Ising model. This paper extends the previous analysis of a one-dimensional (1-D) STN storing a single memory to a model that stores multiple memories and that possesses generalized sparse connectivity. The energy, Lyapunov function, and partition function derived for the 1-D model are generalized to the case of an attractor network with only near-neighbor synapses, coupled to a system that computes memory overlaps. Simulations reveal that 1) the STN dramatically reduces intra-ANN connectivity without severely affecting the size of basins of attraction, with fast self-trapping able to sustain attractors even in the absence of intra-ANN synapses; 2) the basins of attraction can be controlled by a single free parameter, providing natural attention-like effects; 3) the same parameter determines the memory capacity of the network, and the latter is much less dependent than a standard ANN on the noise level of the system; 4) the STN serves as a useful memory for some correlated memory patterns for which the standard ANN totally fails; 5) the STN can store a large number of sparse patterns; and 6) a Monte Carlo procedure, a competitive neural network, and binary neurons with thresholds can be used to induce self-trapping.

17.
Bidirectional associative memory (BAM) generalizes the associative memory (AM) to be capable of performing two-way recall of pattern pairs. Asymmetric bidirectional associative memory (ABAM) is a variant of BAM that relaxes the connection weight symmetry restriction and enjoys much better performance than a conventional BAM structure. Higher-order associative memories (HOAMs) are reputed for their higher memory capacity than their first-order counterparts. The paper concerns the design of a second-order asymmetric bidirectional associative memory (SOABAM) with a maximal basin of attraction, whose extension to a HOABAM is possible and straightforward. First, a necessary and sufficient condition is derived for the connection weight matrix of SOABAM that can guarantee the recall of all prototype pattern pairs. A local training rule which is adaptive in the learning step size is formulated. A theorem is then derived showing that designing the SOABAM to further enlarge the quantities required by the complete recall theorem enhances its capability of evolving a noisy pattern to converge to its associated pattern vector without error. Based on this theorem, the algorithm is also modified to ensure each training pattern is stored with a basin of attraction as large as possible.
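The second-order recall that such models use can be illustrated with a plain second-order correlation memory (a sketch of the general mechanism only; the paper's SOABAM instead optimizes its weights, and the function names and test patterns below are ours):

```python
import numpy as np

def so_encode(X, Y):
    """Second-order correlation encoding: W[i,j,k] = sum over pairs of x_i x_j y_k."""
    return sum(np.einsum('i,j,k->ijk', x, x, y) for x, y in zip(X, Y))

def so_recall(W, x):
    """Forward recall: y_k = sign(sum_{i,j} W[i,j,k] x_i x_j)."""
    return np.where(np.einsum('ijk,i,j->k', W, x, x) >= 0, 1, -1)
```

Because the activation is quadratic in x, the crosstalk between stored pairs scales with the squared overlap of their x patterns, and x and -x recall the same y.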

18.
In this article we present techniques for designing associative memories to be implemented by a class of synchronous discrete-time neural networks based on a generalization of the brain-state-in-a-box neural model. First, we address the local qualitative properties and global qualitative aspects of the class of neural networks considered. Our approach to the stability analysis of the equilibrium points of the network gives insight into the extent of the domain of attraction for the patterns to be stored as asymptotically stable equilibrium points and is useful in the analysis of the retrieval performance of the network and also for design purposes. By making use of the analysis results as constraints, the design for associative memory is performed by solving a constraint optimization problem whereby each of the stored patterns is guaranteed a substantial domain of attraction. The performance of the designed network is illustrated by means of three specific examples.
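The brain-state-in-a-box recurrence underlying these designs is simple: a linear feedback step followed by saturation to the hypercube. The sketch below pairs it with a rank-one Hebbian weight matrix purely for illustration; the paper instead obtains the weights from a constraint optimization problem, and the names and parameters here are ours.

```python
import numpy as np

def bsb_step(W, x, alpha=0.5):
    """One BSB iteration: linear feedback, then saturate to the box [-1, 1]."""
    return np.clip(x + alpha * (W @ x), -1.0, 1.0)

def bsb_recall(W, x0, iters=50):
    """Iterate until the state settles, ideally on a vertex of the hypercube."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = bsb_step(W, x)
    return x
```

With W chosen so that a stored corner p satisfies W p = p, states inside its basin grow along p until every component saturates, so both a scaled-down copy of p and a partially erased p are driven back to the corner.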
