Found 18 similar documents; search took 156 ms.
1.
This paper proposes an optimized learning algorithm for hetero-associative memory models. First, a criterion reflecting the performance of the neural network is converted into an easily controlled cost function, so that determining the weights naturally becomes a global optimization process; the optimization uses gradient descent. The learning algorithm guarantees that every training pattern becomes a stable attractor of the system, with a maximal basin of attraction in the optimization sense. Theoretically, we discuss the storage capacity of the hetero-associative memory model, the asymptotic stability of the training patterns, and the extent of their basins of attraction. Computer experiments fully demonstrate the effectiveness of the algorithm.
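The cost-function formulation in this abstract can be illustrated with a minimal sketch. Here a hinge-style cost with a margin stands in for the paper's (unspecified) cost function: gradient descent on it drives every pattern pair to satisfy its recall condition with a margin, a crude proxy for the size of the attraction basin. All function names, the cost, and the hyperparameters are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def train_hetero(patterns_x, patterns_y, margin=1.0, lr=0.1, epochs=200):
    """Gradient-style learning for a hetero-associative memory.

    Minimises a hinge-like cost sum_{k,i} max(0, margin - y_i^k * (W x^k)_i),
    so every stored pair meets its recall condition with a margin.
    Illustrative stand-in for the paper's cost function, not its exact form.
    """
    X = np.asarray(patterns_x, dtype=float)   # shape (K, n), entries +/-1
    Y = np.asarray(patterns_y, dtype=float)   # shape (K, m), entries +/-1
    W = np.zeros((Y.shape[1], X.shape[1]))
    for _ in range(epochs):
        fields = Y * (X @ W.T)                # y_i^k * (W x^k)_i
        viol = fields < margin                # constraints not yet satisfied
        if not viol.any():
            break
        G = -(viol * Y).T @ X                 # gradient of the hinge cost
        W -= lr * G
    return W

def recall(W, x):
    """One-step recall: bipolar thresholding of the weighted sum."""
    return np.where(W @ x >= 0, 1, -1)
```

With two training pairs, the learned weights recall each stored output from its stored input in one step.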
2.
Based on the idea that the basins of attraction of an associative memory's stored patterns should be kept balanced in size, a minimax criterion for designing Hopfield associative memory networks is proposed: the designed symmetric connection weight matrix should maximize the smallest basin of attraction among the stored patterns. A fast learning algorithm is presented first; a heuristic iterative learning algorithm, called the constrained perceptron optimization learning algorithm, is then developed. Extensive experimental results demonstrate the superiority of the proposed learning algorithms.
3.
4.
5.
To address problems with the max-min fuzzy associative memory network proposed by Kosko, an alternative weight learning rule is given by improving the network's connection-weight learning rule, extending Kosko's feedforward fuzzy associative memory model into a fuzzy bidirectional associative memory model. From this, a fast fuzzy enhanced learning algorithm is derived that can store any given set of multi-valued training pattern pairs. For binary pattern pair sets, the connection weights take only the values 0 and 1, so the algorithm lends itself to hardware and optical implementation. Experimental results show that the fast fuzzy enhanced learning algorithm is effective.
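The classical Kosko-style max-min encoding that this abstract improves upon can be sketched as follows; for binary {0, 1} patterns every weight is 0 or 1, which is what makes hardware and optical realisations attractive. This is the baseline rule, not the improved rule of the abstract, and the function names are illustrative.

```python
def maxmin_encode(pairs):
    """Correlation-minimum (max-min) fuzzy associative memory encoding:
    W[i][j] = max over pairs k of min(x_k[i], y_k[j])."""
    n, m = len(pairs[0][0]), len(pairs[0][1])
    W = [[0] * m for _ in range(n)]
    for x, y in pairs:
        for i in range(n):
            for j in range(m):
                W[i][j] = max(W[i][j], min(x[i], y[j]))
    return W

def maxmin_recall(W, x):
    """Forward recall via max-min composition: y_j = max_i min(x_i, W[i][j])."""
    m = len(W[0])
    return [max(min(x[i], W[i][j]) for i in range(len(x))) for j in range(m)]
```

For compatible binary pairs, each stored input recalls its stored output exactly.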
6.
7.
Constrained Optimization Learning for Associative Memory Networks    Total citations: 2 (self-citations: 0, other citations: 2)
This paper proposes a constrained optimization learning algorithm for associative memory networks. Learning is a global minimization process whose initial solution guarantees that every sample is a stable state of the system; the samples' basins of attraction are then enlarged step by step until the network has maximal basins of attraction in the optimization sense. Theoretically, we analyze the asymptotic stability of the samples, the extent of their basins of attraction, and the convergence of the learning algorithm. Extensive computer experiments show that the learning algorithm is effective.
8.
A new higher-order bidirectional associative memory model is proposed. It generalizes the higher-order bidirectional associative memory model HOBAM (higher-order bidirectional associative memory) proposed by Tai and Jeng, as well as the modified intraconnected bidirectional associative memory model MIBAM (modified intraconnected bidirectional associative memory). By defining an energy function, the stability of the new model under both synchronous and asynchronous update modes is proved, which guarantees that all trained pattern pairs become asymptotically stable points of the model. Using statistical analysis, the storage capacity of the proposed model is estimated. Computer simulations confirm that the model not only has a higher storage capacity but also better error-correcting capability.
9.
10.
11.
J. Ma 《Neural computing & applications》1999,8(1):25-32
We present a study of generalised Hopfield networks for associative memory. By analysing the radius of attraction of a stable state, the Object Perceptron Learning Algorithm (OPLA) and the OPLA scheme are proposed to store a set of sample patterns (vectors) in a generalised Hopfield network with their radii of attraction as large as we require. OPLA modifies a set of weights and a threshold in a way similar to the perceptron learning algorithm. The simulation results show that the OPLA scheme is more effective for associative memory than both the sum-of-outer-product scheme with a Hopfield network and the weighted sum-of-outer-product scheme with an asymmetric Hopfield network.
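The sum-of-outer-product baseline that OPLA is compared against can be sketched as the standard Hebbian Hopfield construction below. This is the comparison scheme, not OPLA itself, and the function names and step budget are illustrative.

```python
import numpy as np

def hopfield_outer_product(patterns):
    """Sum-of-outer-product (Hebbian) weight matrix for a Hopfield network,
    with self-connections removed, over bipolar +/-1 patterns."""
    P = np.asarray(patterns, dtype=float)      # shape (K, n)
    W = P.T @ P / P.shape[1]
    np.fill_diagonal(W, 0.0)                   # no self-connections
    return W

def hopfield_recall(W, state, steps=20):
    """Synchronous recall, iterated until a fixed point or the step budget."""
    s = np.asarray(state, dtype=float)
    for _ in range(steps):
        s_new = np.where(W @ s >= 0, 1.0, -1.0)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s
```

Stored patterns are fixed points, and with a single stored pattern a one-bit corruption is corrected in one step.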
12.
This paper analyzes in detail the learning and recall processes of the bidirectional associative memory (BAM). In learning, an adaptive asymmetric BAM algorithm is applied first, followed by a repeated-memorization algorithm that sets an impression threshold; the relationship between the impression threshold and the samples' basins of attraction is proved theoretically, giving the theoretical basis of the repeated-memorization method. In recall, a running equation with a nonzero threshold function is adopted and a threshold learning method is proposed; it is proved theoretically that adopting the nonzero-threshold running equation further enlarges the basins of attraction. To further increase the network's storage capacity, a parallel BAM structure is introduced. With these methods, the storage capacity and error-correcting ability of the BAM network are greatly improved.
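The two-way recall process analyzed in this abstract can be sketched in its classical first-order form: outer-product encoding and alternating forward/backward threshold updates, with the usual energy function E = -x^T W y that is non-increasing along the dynamics. This is the textbook Kosko BAM with zero thresholds, not the paper's adaptive asymmetric or nonzero-threshold variants; all names are illustrative.

```python
import numpy as np

def bam_weights(pairs):
    """Classical outer-product BAM encoding: W = sum_k x_k y_k^T (bipolar)."""
    return sum(np.outer(x, y) for x, y in pairs).astype(float)

def bam_recall(W, x, steps=20):
    """Bidirectional recall: alternate y = sgn(W^T x), x = sgn(W y) until stable."""
    x = np.asarray(x, dtype=float)
    y = np.where(W.T @ x >= 0, 1.0, -1.0)
    for _ in range(steps):
        x_new = np.where(W @ y >= 0, 1.0, -1.0)
        y_new = np.where(W.T @ x_new >= 0, 1.0, -1.0)
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break
        x, y = x_new, y_new
    return x, y

def bam_energy(W, x, y):
    """BAM energy E = -x^T W y; non-increasing under the recall dynamics."""
    return -float(x @ W @ y)
```

A one-bit-corrupted input still settles to the stored pair, at the pair's energy minimum.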
13.
The paper offers a new kind of neural network for classifying binary patterns. Given the dimensionality of the patterns, the memory capacity of the network grows exponentially with a free parameter s. The paper considers the limitations on the parameter s caused by the fact that greater values of s demand large computer memory and decrease the basins of attraction. In contrast to similar models, the network enjoys larger memory capacity and better recognition capabilities: it can distinguish heavily distorted patterns and even cope with pattern correlation. The negative effect of the latter can easily be suppressed by taking a large enough value of s. A perceptron recognition system is considered to demonstrate the efficiency of the algorithm, yet the method is quite applicable in fully connected associative-memory networks.
The article is published in the original.
14.
15.
This paper obtains several estimates concerning the basins of attraction of the stored patterns of analog feedback associative memories, and the exponential rate at which each point in a basin converges to the corresponding stored pattern. These results can be used in the performance evaluation and synthesis of high-efficiency analog feedback associative memories.
16.
In a previous paper, the self-trapping network (STN) was introduced as more biologically realistic than attractor neural networks (ANNs) based on the Ising model. This paper extends the previous analysis of a one-dimensional (1-D) STN storing a single memory to a model that stores multiple memories and that possesses generalized sparse connectivity. The energy, Lyapunov function, and partition function derived for the 1-D model are generalized to the case of an attractor network with only near-neighbor synapses, coupled to a system that computes memory overlaps. Simulations reveal that 1) the STN dramatically reduces intra-ANN connectivity without severely affecting the size of basins of attraction, with fast self-trapping able to sustain attractors even in the absence of intra-ANN synapses; 2) the basins of attraction can be controlled by a single free parameter, providing natural attention-like effects; 3) the same parameter determines the memory capacity of the network, and the latter is much less dependent than a standard ANN on the noise level of the system; 4) the STN serves as a useful memory for some correlated memory patterns for which the standard ANN totally fails; 5) the STN can store a large number of sparse patterns; and 6) a Monte Carlo procedure, a competitive neural network, and binary neurons with thresholds can be used to induce self-trapping.
17.
Jyh-Yeong Chang Chien-Wen Cho 《IEEE transactions on systems, man, and cybernetics. Part A, Systems and humans : a publication of the IEEE Systems, Man, and Cybernetics Society》2003,33(4):421-428
Bidirectional associative memory (BAM) generalizes the associative memory (AM) to be capable of performing two-way recall of pattern pairs. Asymmetric bidirectional associative memory (ABAM) is a variant of BAM that relaxes the connection-weight symmetry restriction and enjoys much better performance than a conventional BAM structure. Higher-order associative memories (HOAMs) are reputed for their higher memory capacity than their first-order counterparts. The paper concerns the design of a second-order asymmetric bidirectional associative memory (SOABAM) with a maximal basin of attraction, whose extension to a HOABAM is possible and straightforward. First, a necessary and sufficient condition is derived for the connection weight matrix of a SOABAM that can guarantee the recall of all prototype pattern pairs. A local training rule that is adaptive in the learning step size is formulated. A theorem is then derived showing that designing a SOABAM to further enlarge the quantities required by the complete recall condition will enhance its capability of evolving a noisy pattern to converge to its associated pattern vector without error. Based on this theorem, the algorithm is also modified to ensure that each training pattern is stored with a basin of attraction as large as possible.
18.
In this article we present techniques for designing associative memories to be implemented by a class of synchronous discrete-time neural networks based on a generalization of the brain-state-in-a-box neural model. First, we address the local qualitative properties and global qualitative aspects of the class of neural networks considered. Our approach to the stability analysis of the equilibrium points of the network gives insight into the extent of the domain of attraction for the patterns to be stored as asymptotically stable equilibrium points and is useful in the analysis of the retrieval performance of the network and also for design purposes. By making use of the analysis results as constraints, the design for associative memory is performed by solving a constraint optimization problem whereby each of the stored patterns is guaranteed a substantial domain of attraction. The performance of the designed network is illustrated by means of three specific examples.
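The brain-state-in-a-box dynamics that this abstract generalizes can be sketched in its basic form: a linear feedback step followed by clipping to the hypercube [-1, 1]^n, under which stored vertices act as asymptotically stable equilibria. This is the textbook BSB update, not the paper's generalized synchronous model; the Hebbian weights, the gain β, and the names are illustrative assumptions.

```python
import numpy as np

def bsb_step(W, x, beta=0.1):
    """One brain-state-in-a-box update: linear feedback, then clip to the box."""
    return np.clip(x + beta * (W @ x), -1.0, 1.0)

def bsb_recall(W, x, steps=100, beta=0.1):
    """Iterate the BSB map; trajectories flow toward stored vertices of the box."""
    x = np.asarray(x, dtype=float)
    for _ in range(steps):
        x = bsb_step(W, x, beta)
    return x
```

Starting from an interior point inside a stored pattern's domain of attraction, the state is driven out to the corresponding vertex of the box.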