16 similar records found; search took 78 ms
1.
Analysis of the Dynamic Characteristics of Delayed Discrete Hopfield Networks [Cited: 3 total, 0 self, 3 others]
The stability of neural networks is regarded as the foundation of their many applications. Using the state-transition equations and an energy function, this paper studies the dynamics of discrete Hopfield neural networks with delay. Sufficient conditions are given under which a delayed discrete Hopfield network converges to a limit cycle of period at most 2, as well as sufficient conditions for convergence to special limit cycles of period 2 and period 4. Necessary conditions are also obtained for the network to have no stable state at all. These results not only generalize some existing conclusions but also provide a theoretical basis for applications of such networks.
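The parallel updating rule studied in this abstract can be sketched in a few lines. A minimal simulation follows; the example matrices and the sgn(0) = +1 tie-breaking convention are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def delayed_hopfield_step(W0, W1, x_t, x_prev):
    """One parallel update of a delayed discrete Hopfield network:
    x(t+1) = sgn(W0 x(t) + W1 x(t-1)), with sgn(0) treated as +1."""
    h = W0 @ x_t + W1 @ x_prev
    return np.where(h >= 0, 1, -1)

def find_period(W0, W1, x0, x1, max_steps=200):
    """Iterate from the pair (x0, x1) = (x(-1), x(0)) and return the
    period of the cycle the trajectory settles into (1 = fixed point)."""
    seen = {}
    state = (tuple(x0), tuple(x1))
    for t in range(max_steps):
        if state in seen:
            return t - seen[state]
        seen[state] = t
        x_next = delayed_hopfield_step(W0, W1,
                                       np.array(state[1]), np.array(state[0]))
        state = (state[1], tuple(x_next))
    return None

W0 = np.array([[0.0, 1.0], [1.0, 0.0]])   # symmetric, zero diagonal
W1 = np.zeros((2, 2))                      # delay coupling switched off
period = find_period(W0, W1, np.array([1, -1]), np.array([1, -1]))
```

With the delay term switched off and a symmetric zero-diagonal weight matrix, the trajectory above lands on a period-2 limit cycle, consistent with the "period at most 2" conditions the abstract describes.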
2.
3.
4.
Dynamical-Systems Analysis of Discrete-Time Hopfield Networks [Cited: 2 total, 0 self, 2 others]
A discrete-time Hopfield network is a nonlinear dynamical system. By introducing a new energy function on the network state and exploiting the subgradient properties of convex functions, conditions are obtained under which the state energy decreases monotonically. For a Hopfield network whose activation functions are monotonically nondecreasing (not necessarily strictly increasing), the network converges asymptotically in fully parallel mode if each neuron's activation gain exceeds the smallest eigenvalue of the weight matrix; in serial mode, it suffices that, for each neuron, the sum of its activation gain and its self-feedback weight is positive. Moreover, if the activation functions are monotone and the connection weights are symmetric, the subgradient properties of convex functions show that the discrete-time Hopfield network, updated in fully parallel mode, converges to a limit cycle of period at most 2.
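The energy-descent property behind these serial-mode results is easy to check numerically. The sketch below uses the standard Hopfield energy E(x) = -x'Wx/2 - b'x with a symmetric, zero-diagonal weight matrix (this simple setting is an illustrative assumption; the paper's conditions are more general):

```python
import numpy as np

def energy(W, b, x):
    # Standard discrete Hopfield energy E(x) = -1/2 x^T W x - b^T x
    return -0.5 * x @ W @ x - b @ x

def serial_update(W, b, x, i):
    # Update neuron i only; sgn(0) is taken as +1
    x = x.copy()
    h = W[i] @ x + b[i]
    x[i] = 1 if h >= 0 else -1
    return x

# Symmetric weights with zero diagonal: serial dynamics never raises E.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
W = (A + A.T) / 2
np.fill_diagonal(W, 0.0)
b = np.zeros(5)

x = np.where(rng.standard_normal(5) >= 0, 1, -1)
energies = [energy(W, b, x)]
for sweep in range(10):
    for i in range(5):
        x = serial_update(W, b, x, i)
        energies.append(energy(W, b, x))
```

Because only one component changes per step and the diagonal is zero, each flip changes the energy by -2|h_i| <= 0, so the recorded sequence is non-increasing.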
5.
A Hopfield Network Learning Algorithm Based on Evolutionary Programming with Forgetting [Cited: 4 total, 0 self, 4 others]
This paper proposes a Hopfield network learning algorithm based on evolutionary programming with forgetting. By discarding ("forgetting") part of the population, the algorithm avoids local minima. Given fixed points, limit cycles, or iteration sequences, the algorithm obtains both the topology and the weights of a Hopfield network by solving inequalities, thereby overcoming the limitations of evolutionary Hopfield learning. It can also find multiple optimal solutions. Experiments confirm its effectiveness.
6.
This paper analyzes the Hopfield artificial neural network (ANN), raises the stability problem of Hopfield networks, and improves the Hopfield learning algorithm.
7.
Applications of Hopfield Networks in Optimization [Cited: 10 total, 1 self, 10 others]
This paper summarizes the general steps and methods for applying Hopfield networks to optimization. Through two application examples, the traveling salesman problem (TSP) and a system parameter identification problem, the key steps and techniques for solving optimization problems with Hopfield networks are analyzed and explained in detail; the approach has a degree of generality and practicality.
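The general recipe described here is: encode the objective as a network energy, read off weights and biases, then integrate the network dynamics. A minimal sketch on a toy bound-constrained quadratic follows (the mapping W = -Q, b = c, the gain, and the step size are illustrative assumptions, not the paper's TSP formulation):

```python
import numpy as np

def hopfield_optimize(Q, c, v0, gain=5.0, dt=0.01, steps=2000):
    """Continuous-Hopfield-style dynamics for minimizing
    f(v) = 1/2 v^T Q v - c^T v over [0,1]^n.
    Mapping: weights W = -Q, bias b = c, outputs v = sigmoid(gain*u)."""
    W, b = -Q, c
    u = np.log(v0 / (1 - v0)) / gain           # inverse sigmoid init
    for _ in range(steps):
        v = 1.0 / (1.0 + np.exp(-gain * u))
        u = u + dt * (-u + W @ v + b)          # Euler step of du/dt = -u + Wv + b
    return 1.0 / (1.0 + np.exp(-gain * u))

Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([1.0, 1.8])
v0 = np.array([0.2, 0.2])
v = hopfield_optimize(Q, c, v0)
f = lambda v: 0.5 * v @ Q @ v - c @ v
```

The leak term -u slightly biases the equilibrium away from the unconstrained minimizer (here the first coordinate still settles at exactly 0.5, where u = 0 is a fixed point), which is why the choice of gain is a practical tuning knob in this style of optimization.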
8.
9.
10.
11.
In this paper, chaos in a new class of three-dimensional continuous-time Hopfield neural networks is investigated. Numerical experiments show that this class of Hopfield neural networks can have chaotic attractors and limit cycles for different parameter configurations. By virtue of horseshoe theory in dynamical systems, rigorous computer-assisted verifications of their chaotic behavior are carried out. In terms of topological entropy, quantitative interpretations of these networks' complexity are given. A brief analysis of their robustness is also presented.
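The class of systems studied here can be simulated directly. The sketch below integrates a three-neuron continuous-time Hopfield network du/dt = -u + W tanh(u) + b by the Euler method; the weight matrix and parameters are illustrative assumptions, not the configurations verified in the paper, and no claim of chaos is made for them:

```python
import numpy as np

def simulate_hopfield_3d(W, b, u0, dt=0.01, steps=5000):
    """Euler integration of a 3-neuron continuous-time Hopfield network:
    du/dt = -u + W tanh(u) + b. Returns the state trajectory."""
    u = np.array(u0, dtype=float)
    traj = np.empty((steps, 3))
    for k in range(steps):
        u = u + dt * (-u + W @ np.tanh(u) + b)
        traj[k] = u
    return traj

# Illustrative parameters (assumed, not from the cited paper)
W = np.array([[ 2.0, -1.2, 0.0],
              [ 1.8,  1.7, 1.2],
              [-5.0,  0.0, 1.0]])
b = np.zeros(3)
traj = simulate_hopfield_3d(W, b, u0=[0.1, 0.1, 0.1])
```

Because tanh is bounded, trajectories of this model remain bounded regardless of W, which is what makes attractor-based analyses (horseshoes, topological entropy) applicable.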
12.
Runnian Ma, Yu Xie, Shengrui Zhang, Wenbin Liu, Computers & Mathematics with Applications 57(11-12), 2009, 1869
The discrete delayed Hopfield neural network is an extension of the discrete Hopfield neural network. In this paper, the convergence of discrete delayed Hopfield neural networks is studied, and several results on convergence are obtained using a Lyapunov function. New sufficient conditions are proved under which the delayed network, in parallel updating mode, converges to a limit cycle of period at most 2, and further conditions are given for convergence to a limit cycle of period 2. All results established here extend previous results on the convergence of both discrete Hopfield neural networks and discrete delayed Hopfield neural networks in parallel updating mode.
13.
D. L. Lee, IEEE Transactions on Neural Networks 10(4), 1999, 975-978
Cernuschi-Frias proposed (IEEE Trans. Syst., Man, Cybern., vol. 19, pp. 887-888, 1989) a partial simultaneous updating (PSU) mode for Hopfield networks and derived sufficient conditions to ensure global stability. In this letter, a counterexample is given to illustrate that the PSU sequence may converge to limit cycles even when the connection matrix satisfies the Cernuschi-Frias conditions. New sufficient conditions ensuring global convergence of a Hopfield network in PSU mode are then derived. Compared with the fully parallel case, the new result permits a slight relaxation of the lower bound on the main diagonal elements of the connection matrix.
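PSU sits between serial updating (one neuron per step) and fully parallel updating (all neurons per step): at each step a subset of neurons updates simultaneously. A minimal sketch follows; the Hebbian one-pattern example and the fixed sweep of overlapping groups are illustrative assumptions, not the letter's counterexample:

```python
import numpy as np

def psu_step(W, x, subset):
    """Partial simultaneous updating (PSU): only the neurons in `subset`
    update, in parallel, from the current state; the rest hold their values."""
    idx = list(subset)
    x_new = x.copy()
    h = W @ x
    x_new[idx] = np.where(h[idx] >= 0, 1, -1)
    return x_new

# Hebbian storage of a single pattern, then recovery from a corrupted
# state via a deterministic sweep of overlapping PSU groups.
p = np.array([1, -1, 1, -1, 1])
W = np.outer(p, p).astype(float)   # positive diagonal (W_ii = 1)
x = p.copy()
x[0] = -x[0]                        # flip one bit
for subset in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]:
    x = psu_step(W, x, subset)
```

In this benign setting the trajectory reaches the stored pattern and stays there; the point of the letter is precisely that such convergence is not guaranteed in PSU mode without stronger conditions on the diagonal of W.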
14.
15.
We consider networks of a large number of neurons (or units, processors, ...), whose dynamics are fully asynchronous with overlapping updating. We suppose that the neurons take a finite number of states (discrete states) and that the updating scheme is discrete in time. We make no hypotheses on the activation function of the neurons; the networks may have multiple cycles and basins. We derive conditions on the initialization of the networks that ensure convergence to fixed points only. An application to a fully asynchronous Hopfield neural network validates our study.
16.
Wavelet Hopfield Neural Networks and Their Application to Optimization [Cited: 3 total, 1 self, 3 others]
By replacing the sigmoid activation function of the Hopfield network with the Morlet wavelet, a new type of Hopfield network, the wavelet Hopfield neural network (WHNN), is proposed. Because the Morlet wavelet has good local approximation capability and a high degree of nonlinearity, the WHNN achieves satisfactorily high accuracy in nonlinear function optimization. A typical function-optimization example shows that the wavelet Hopfield network attains higher accuracy than the standard Hopfield network.
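The substitution described here amounts to swapping one activation function for another in the network dynamics. A minimal sketch follows; the frequency parameter omega0 = 1.75 and the Euler-step dynamics are illustrative assumptions:

```python
import numpy as np

def morlet(x, omega0=1.75):
    """Morlet wavelet, used here as the neuron activation:
    psi(x) = cos(omega0 * x) * exp(-x^2 / 2)."""
    return np.cos(omega0 * x) * np.exp(-x ** 2 / 2)

def whnn_step(W, b, u, dt=0.01):
    # One Euler step of Hopfield-type dynamics with the sigmoid
    # replaced by the Morlet wavelet: du/dt = -u + W psi(u) + b
    return u + dt * (-u + W @ morlet(u) + b)

u = whnn_step(np.eye(2), np.zeros(2), np.array([0.5, -0.5]))
```

Unlike the sigmoid, the Morlet wavelet is localized (it decays to zero away from the origin) and non-monotone, which is the source of the higher nonlinearity the abstract attributes to the WHNN.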